Add better documentation for handling batch mappings with parameters #1168
We've discussed this problem in a few issues; #232 is a good example, but there are others. Let's use this issue to work on some documentation in Spring for GraphQL on this.

Sorry I initially missed your question on Stack Overflow. I'll paste the answer here as well:

The batching feature in graphql-java and Spring for GraphQL is not "just" a way to work around the N+1 issue for a single data fetcher. This is a more general mechanism for loading elements in batches and caching their resolution for the lifetime of the GraphQL request. More specifically, the batch loading functions do have access to the `BatchLoaderEnvironment`, but not to the arguments of the individual fields being resolved.

For your case, I would say that there are two possible approaches: fetching then filtering, or doing a tailored fetch. Let's use the following schema for this example:

```graphql
type Query {
  me: Person
  people: [Person]
}

input FriendsFilter {
  favoriteBeverage: String
}

type Person {
  id: ID!
  name: String
  favoriteBeverage: String
  friends(filter: FriendsFilter): [Person]
}
```

**Fetching then filtering**

One approach would be to fetch all friends for a given person, possibly caching their values for the entire lifetime of the GraphQL request.

```java
import java.util.Collection;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.stream.Collectors;

import org.dataloader.DataLoader;
import reactor.core.publisher.Mono;

import org.springframework.graphql.data.method.annotation.Argument;
import org.springframework.graphql.data.method.annotation.QueryMapping;
import org.springframework.graphql.data.method.annotation.SchemaMapping;
import org.springframework.graphql.execution.BatchLoaderRegistry;
import org.springframework.stereotype.Controller;

@Controller
public class FriendsController {

    private final Map<Integer, Person> people = Map.of(
            1, new Person(1, "Rossen", "coffee", List.of(2, 3)),
            2, new Person(2, "Brian", "tea", List.of(1, 3)),
            3, new Person(3, "Donna", "tea", List.of(1, 2, 4)),
            4, new Person(4, "Brad", "coffee", List.of(1, 2, 3, 5)),
            5, new Person(5, "Andi", "coffee", List.of(1, 2, 3, 4))
    );

    public FriendsController(BatchLoaderRegistry registry) {
        registry.forTypePair(Integer.class, Person.class).registerMappedBatchLoader((personIds, env) -> {
            // fetch all friends and do not apply the filter, caching each Person by id
            Map<Integer, Person> friends = new HashMap<>();
            personIds.forEach(personId -> friends.put(personId, people.get(personId)));
            return Mono.just(friends);
        });
    }

    @QueryMapping
    public Person me() {
        return this.people.get(2);
    }

    @QueryMapping
    public Collection<Person> people() {
        return this.people.values();
    }

    @SchemaMapping
    public CompletableFuture<List<Person>> friends(Person person, @Argument FriendsFilter filter,
            DataLoader<Integer, Person> dataLoader) {
        // load all friends, THEN apply the given filter
        return dataLoader
                .loadMany(person.friendsId())
                .thenApply(filter::apply);
    }

    public record Person(Integer id, String name, String favoriteBeverage, List<Integer> friendsId) {
    }

    public record FriendsFilter(String favoriteBeverage) {

        List<Person> apply(List<Person> friends) {
            return friends.stream()
                    .filter(person -> person.favoriteBeverage().equals(this.favoriteBeverage))
                    .collect(Collectors.toList());
        }
    }
}
```

In practice, this request:

```graphql
query {
  me {
    name
    friends(filter: {favoriteBeverage: "tea"}) {
      name
      favoriteBeverage
    }
  }
  people {
    name
    friends(filter: {favoriteBeverage: "coffee"}) {
      name
      favoriteBeverage
    }
  }
}
```

will yield:

```json
{
  "data": {
    "me": {
      "name": "Brian",
      "friends": [
        {
          "name": "Donna",
          "favoriteBeverage": "tea"
        }
      ]
    },
    "people": [
      {
        "name": "Andi",
        "friends": [
          {
            "name": "Rossen",
            "favoriteBeverage": "coffee"
          },
          {
            "name": "Brad",
            "favoriteBeverage": "coffee"
          }
        ]
      },
      {
        "name": "Brad",
        "friends": [
          {
            "name": "Rossen",
            "favoriteBeverage": "coffee"
          },
          {
            "name": "Andi",
            "favoriteBeverage": "coffee"
          }
        ]
      },
      {
        "name": "Donna",
        "friends": [
          {
            "name": "Rossen",
            "favoriteBeverage": "coffee"
          },
          {
            "name": "Brad",
            "favoriteBeverage": "coffee"
          }
        ]
      },
      {
        "name": "Brian",
        "friends": [
          {
            "name": "Rossen",
            "favoriteBeverage": "coffee"
          }
        ]
      },
      {
        "name": "Rossen",
        "friends": []
      }
    ]
  }
}
```

Note: we have here two different operations fetching friends with different filters, but they're both using the batch loading function.
**Tailored fetch**

Let's try and just fetch the values we need.

```java
// imports are the same as in the previous example
@Controller
public class FriendsController {

    private final Map<Integer, Person> people = Map.of(
            1, new Person(1, "Rossen", "coffee", List.of(2, 3)),
            2, new Person(2, "Brian", "tea", List.of(1, 3)),
            3, new Person(3, "Donna", "tea", List.of(1, 2, 4)),
            4, new Person(4, "Brad", "coffee", List.of(1, 2, 3, 5)),
            5, new Person(5, "Andi", "coffee", List.of(1, 2, 3, 4))
    );

    public FriendsController(BatchLoaderRegistry registry) {
        // we're now using a composed key
        registry.forTypePair(FriendFilterKey.class, Person[].class).registerMappedBatchLoader((keys, env) -> {
            // perform efficient fetching by delegating the filter operation to the data store
            Map<FriendFilterKey, Person[]> result = new HashMap<>();
            keys.forEach(key -> {
                Person[] friends = key.person().friendsId().stream()
                        .map(people::get)
                        .filter(friend -> key.friendsFilter().matches(friend))
                        .toArray(Person[]::new);
                result.put(key, friends);
            });
            return Mono.just(result);
        });
    }

    @QueryMapping
    public Person me() {
        return this.people.get(2);
    }

    @QueryMapping
    public Collection<Person> people() {
        return this.people.values();
    }

    @SchemaMapping
    public CompletableFuture<Person[]> friends(Person person, @Argument FriendsFilter filter,
            DataLoader<FriendFilterKey, Person[]> dataLoader) {
        return dataLoader.load(new FriendFilterKey(person, filter));
    }

    public record Person(Integer id, String name, String favoriteBeverage, List<Integer> friendsId) {
    }

    public record FriendsFilter(String favoriteBeverage) {

        boolean matches(Person friend) {
            return friend.favoriteBeverage().equals(this.favoriteBeverage);
        }
    }

    // because this key contains both the person and the filter, we will need to fetch the same friend multiple times
    public record FriendFilterKey(Person person, FriendsFilter friendsFilter) {
    }
}
```
Note: we can't consider a simpler approach that keeps an `Integer` -> `Person` loader and applies the filter inside it, because friends that don't match the filter would then come back as `null` entries in the response:

```
"me": {
  "name": "Brian",
  "friends": [
    {
      "name": "Donna",
      "favoriteBeverage": "tea"
    },
    null // Rossen is filtered out
  ]
},
```

There is more to this, as we could also explain how context keys can be used to pass a per-key context. The goal is not to replace the existing support, but to better document what the batch loading mechanism can do.

@tstocker-black-cape Please let us know if this comment helps you and if it needs some clarification. We can then consider adding a new section in the documentation. Thanks!
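As an illustration of the per-key context idea mentioned above, here is a minimal sketch. It assumes the `Person` and `FriendsFilter` types from the earlier examples are available as shared types, and `fetchFriends` is a hypothetical data-access method; imports are the same as in the examples above.

```java
@Controller
public class FriendsWithContextController {

    public FriendsWithContextController(BatchLoaderRegistry registry) {
        registry.forTypePair(Integer.class, Person[].class).registerMappedBatchLoader((personIds, env) -> {
            // key contexts are whatever was passed as the second argument to DataLoader#load
            Map<Object, Object> keyContexts = env.getKeyContexts();
            Map<Integer, Person[]> result = new HashMap<>();
            personIds.forEach(personId -> {
                FriendsFilter filter = (FriendsFilter) keyContexts.get(personId);
                result.put(personId, fetchFriends(personId, filter));
            });
            return Mono.just(result);
        });
    }

    @SchemaMapping
    public CompletableFuture<Person[]> friends(Person person, @Argument FriendsFilter filter,
            DataLoader<Integer, Person[]> dataLoader) {
        // pass the filter along as the key context for this person id
        return dataLoader.load(person.id(), filter);
    }

    private Person[] fetchFriends(Integer personId, FriendsFilter filter) {
        // hypothetical data access that applies the filter at the source
        return new Person[0];
    }
}
```

Note that the `DataLoader` still caches by key, so with a plain person-id key, the same person requested under two different filters would return the cached (first) result; that caveat is why the composed-key approach above uses both the person and the filter in the key.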
@bclozel Thanks so much for taking the time to help me out with this! I appreciate your example; it definitely helped me wrap my head around how everything fits together. I'm privileged to be able to "push the gas and make it go" for the most part, thanks to Spring Boot. Your second example, "Tailored fetch", is what I'm going for, but you side-stepped a key aspect of my issue by using a native array instead of a typed list. Type erasure is the root of my issue at the moment. To demonstrate this, I've created an example project that builds off the "People" API we've been using, with some key changes.

If you run the only test in the project, you can see the problem for yourself.
Thanks for the detailed sample @tstocker-black-cape. The main issue here is that your batch loaders are registered against the same erased types: both list-valued fields end up registered for `List.class`, so the registrations clash. Registering each batch loader under a distinct name in the `BatchLoaderRegistry` avoids that.

In general, I don't think this changes much about the documentation to be added; this is more of a combination with pagination.
Thanks for taking the time to look at this!
If you feel that there is no action to take here, feel free to close the ticket. FWIW, I have found a way to achieve my goals using a name in the batch loader registry, so thank you for your example code! I still maintain that filtered sub-selections (schema mappings) are a common occurrence in GraphQL APIs, and it would be amazing if Spring GraphQL supported them without running into the N+1 problem. If you ever find a way to enhance the @BatchMapping annotation to work in that way, I'll be the first to adopt it. Thanks again for taking the time with me!
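For reference, a minimal sketch of what name-based registration can look like. The loader names, the `Pet` type, and the `loadFriends`/`loadPets` methods below are hypothetical; `Person` is as in the examples above, and `graphql.schema.DataFetchingEnvironment` is the additional import needed.

```java
@Controller
public class NamedLoadersController {

    public NamedLoadersController(BatchLoaderRegistry registry) {
        // two list-valued loaders: distinct names avoid the List.class type-erasure clash
        registry.<Integer, List<Person>>forName("friendsByPersonId")
                .registerMappedBatchLoader((personIds, env) -> Mono.just(loadFriends(personIds)));

        registry.<Integer, List<Pet>>forName("petsByPersonId")
                .registerMappedBatchLoader((personIds, env) -> Mono.just(loadPets(personIds)));
    }

    @SchemaMapping
    public CompletableFuture<List<Person>> friends(Person person, DataFetchingEnvironment env) {
        DataLoader<Integer, List<Person>> dataLoader = env.getDataLoader("friendsByPersonId");
        return dataLoader.load(person.id());
    }

    @SchemaMapping
    public CompletableFuture<List<Pet>> pets(Person person, DataFetchingEnvironment env) {
        DataLoader<Integer, List<Pet>> dataLoader = env.getDataLoader("petsByPersonId");
        return dataLoader.load(person.id());
    }

    // hypothetical data-access methods
    private Map<Integer, List<Person>> loadFriends(Set<Integer> personIds) { return Map.of(); }
    private Map<Integer, List<Pet>> loadPets(Set<Integer> personIds) { return Map.of(); }

    record Pet(Integer id, String name) {
    }
}
```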
Fair enough. I was merely pointing to the fact that Spring for GraphQL supports the popular relay connection spec and that it would greatly simplify your codebase and schema design. But I understand that you might need to stick to an existing schema that uses a different pagination mechanism.
The main goal behind using batch loading is not only to load many entities at once, but also to avoid loading the same entity twice. By default, the data loader will cache loaded entities in memory for the entire duration of the request. Given the size of your dataset, I understand that loading all connected types in the graph is not an option. By using specific filtered keys, you will most likely load the same entity (a `Person`) multiple times, once per distinct filter.
I think it would be beneficial for us to improve our reference documentation to explain a bit why batch loading is different from queries and mutations when it comes to the environment. We certainly can't get into specific details about design and tradeoffs; this would probably belong in a tutorial or a blog post, not our reference documentation.
I'm glad you managed to work things out. We do recognize that this is a common pattern, but as we can see, there is no single approach that fits every case. Thanks for the discussion and for bringing this up in the first place!
Great discussion here, and good documentation improvements to come. I'll just mention a couple of things.
It is also worth noting that our support for pagination is flexible and pluggable. For example, instead of the defaults we provide, you can plug in your own cursor strategy and pagination mechanism.
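To make that concrete, here is a rough sketch of what a paginated `friends` field can look like with the built-in cursor support. It assumes the field is exposed as a connection type in the schema and that `friendService` is a hypothetical data-access bean returning a Spring Data `Window`; the key types are `org.springframework.graphql.data.query.ScrollSubrange` and `org.springframework.data.domain.Window`/`ScrollPosition`.

```java
@SchemaMapping
public Window<Person> friends(Person person, @Argument FriendsFilter filter, ScrollSubrange subrange) {
    // the subrange carries the decoded "first"/"after" (or "last"/"before") connection arguments
    ScrollPosition position = subrange.position().orElse(ScrollPosition.offset());
    int count = subrange.count().orElse(20);
    return friendService.findFriends(person.id(), filter, position, count);
}
```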
I submitted a Stack Overflow question with the spring-graphql tag that you can find here.
Given a GraphQL schema that contains data like the following:
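For illustration, assume a schema along the lines of the `Person` type from earlier in this thread, extended with a second list-valued field (the `pets` field and `Pet` type here are hypothetical):

```graphql
type Person {
  id: ID!
  name: String
  friends(filter: FriendsFilter): [Person]
  pets: [Pet]
}

type Pet {
  id: ID!
  name: String
}

input FriendsFilter {
  favoriteBeverage: String
}
```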
I can create a schema mapping in my controller that looks like the following:
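A hypothetical version of such a mapping, resolving friends one person at a time (which is what leads to N+1), might look like this; `friendRepository` is an assumed data-access bean:

```java
@SchemaMapping
public List<Person> friends(Person person, @Argument FriendsFilter filter) {
    // hypothetical repository call; executes once per Person in the result set
    return friendRepository.findFriends(person.id(), filter);
}
```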
However, this runs us into the N+1 problem, so to solve that we need to batch. What I would expect to be able to do is modify my code to the following:
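The shape being described — a `@BatchMapping` method that also receives the field's `filter` argument — would be something like the following hypothetical snippet (which, as the next paragraph notes, is not supported); `friendRepository` is again an assumed data-access bean:

```java
@BatchMapping
public Map<Person, List<Person>> friends(List<Person> people, @Argument FriendsFilter filter) {
    // hypothetical batched repository call that applies the filter for all people at once
    return friendRepository.findFriendsForAll(people, filter);
}
```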
I have found that Spring for GraphQL does not support this kind of thing. While this kind of support would be ideal, I'm willing to work around it, but all the other answers I'm finding lose the type information and attempt to register a mapper for the pair `Person.class, List.class`, which is insufficient because I have two fields that are both lists. What exactly is the simplest and most correct way forward here? I have to solve the N+1 problem, and I have to preserve the filtering functionality of my API.
I've tried reading through the closed issues asking for this feature and I still haven't quite found the answer I'm looking for. I could really just use some help finding the right thing to do in this case where a filter is required and we can't sacrifice the typing of List.