Best practice to deserialize a JSON and avoid the heap size limit

I’m sending an HTTP request to get a JSON payload of Students. The JSON is complex, with arrays, and it is split into 3 objects in Salesforce.

I’m using a class (JSON2APEX) to deserialize the JSON.

Some of the time the JSON is very big, and when I receive it in the HttpResponse variable and deserialize it into the class I created, it hits the heap size limit.

    Http h = new Http();
    HttpRequest httpReq = new HttpRequest();
    httpReq.setMethod('GET');
    httpReq.setTimeout(120000);
    httpReq.setHeader('Content-Type','application/json');
    httpReq.setEndpoint('https://endPoint/students');
    System.debug(Limits.getHeapSize()); // 0.001234 MB
    HttpResponse res = h.send(httpReq);
    System.debug(Limits.getHeapSize()); // 6.5 MB

    RootObject data = (RootObject) JSON.deserialize(res.getBody(), RootObject.class);
    // It breaks here; heap usage is almost 13 MB.

So I can’t really do any heap cleanup, because it breaks before I can clear the res variable.
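For reference, this is roughly how the numbers above compare to the running context’s limit; a minimal sketch (the 6 MB synchronous / 12 MB asynchronous ceiling comes from Limits.getLimitHeapSize(), and the comments restate the figures from the debug output above):

    // Check how much headroom is left before attempting the deserialization.
    System.debug('Heap used: ' + Limits.getHeapSize());          // ~6.5 MB after the callout above
    System.debug('Heap ceiling: ' + Limits.getLimitHeapSize());  // 6,000,000 sync / 12,000,000 async
    System.debug('Body length (chars): ' + res.getBody().length());
    // Deserializing adds the typed object graph on top of the raw string,
    // which is why usage roughly doubles to ~13 MB here and breaks the limit.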

Is there another way to do this?

I want to know if there is any solution I can apply here, without asking the external system (3rd party) to break the JSON up…

Up to this point, @sfdcfox gave a solution that solves my issue. Still, I would like to hear a general solution for even bigger JSON payloads, where those tricks don’t help.

Follow-up to the question:

I have a managed package (“HEDA”), a Salesforce package for university use.

When I upsert many contacts, it throws an error:

System.LimitException: hed:Too many SOQL queries: 201.

I’m running only 3 SOQL queries in my process, and I can see that in the debug output at the end of my code (System.debug('getQueries() -> ' + Limits.getQueries());).

(I’m running this in Queueable Apex, so it gives me more queries to spend.)

I can’t see the code that the “HEDA” package runs under the hood, but what other ways are there to solve this issue when the JSON payload is too big, or when we need to watch the limits alongside other packages installed in our orgs?

What are the best ways to handle this if the 3rd-party service can’t change its API to send a smaller JSON, or send it in parts, for example…
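For reference, a rough sketch of one general pattern that limits how many records the managed-package triggers handle per transaction: chain Queueable jobs and upsert a small chunk of contacts in each, so every transaction starts with a fresh set of governor limits (the class name and chunk size below are illustrative, not part of the original code):

    public class ContactChunkUpsertJob implements Queueable {
        private List<Contact> remaining;
        private Integer chunkSize;

        public ContactChunkUpsertJob(List<Contact> remaining, Integer chunkSize) {
            this.remaining = remaining;
            this.chunkSize = chunkSize;
        }

        public void execute(QueueableContext ctx) {
            // Take one chunk off the list and upsert it in this transaction.
            List<Contact> chunk = new List<Contact>();
            while (!remaining.isEmpty() && chunk.size() < chunkSize) {
                chunk.add(remaining.remove(remaining.size() - 1));
            }
            upsert chunk;
            // Chain the rest into a new transaction with its own governor limits.
            if (!remaining.isEmpty()) {
                System.enqueueJob(new ContactChunkUpsertJob(remaining, chunkSize));
            }
        }
    }

    // Usage: System.enqueueJob(new ContactChunkUpsertJob(allContacts, 50));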

Many, many thanks!

Answers:


Method 1

The heap limit is more of a “suggestion” than a hard rule. Having 13 MB of used heap isn’t a big deal if you only do so briefly. However, there is a hard limit for strings; one single string cannot exceed the heap size limit. Similarly, JSON.deserialize (and related methods) will automatically fail if the string’s size exceeds the total heap size (regardless of actual heap usage at the moment it is called).

That said, one trick you can try is to bypass the heap entirely:

    HttpRequest httpReq = new HttpRequest();
    httpReq.setMethod('GET');
    httpReq.setTimeout(120000);
    httpReq.setHeader('Content-Type','application/json');
    httpReq.setEndpoint('https://endPoint/students');
    RootObject data = (RootObject) JSON.deserialize(
        new Http().send(httpReq).getBody(), RootObject.class);

This bypasses the interim heap usage, although if the string itself is too large, you’ll still hit the limit. In that case, consider using asynchronous code (future, Queueable) to get the higher heap limit of 12 MB, which will allow you to parse more data.
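For example, a minimal sketch of moving the callout into a Queueable (RootObject and the endpoint are the ones from the question; Database.AllowsCallouts must be implemented to allow a callout from a Queueable, and the job class name is illustrative):

    public class StudentSyncJob implements Queueable, Database.AllowsCallouts {
        public void execute(QueueableContext ctx) {
            HttpRequest httpReq = new HttpRequest();
            httpReq.setMethod('GET');
            httpReq.setTimeout(120000);
            httpReq.setHeader('Content-Type', 'application/json');
            httpReq.setEndpoint('https://endPoint/students');
            // Chain the calls so the body is never held in an extra local variable,
            // and the 12 MB asynchronous heap limit applies instead of the 6 MB synchronous one.
            RootObject data = (RootObject) JSON.deserialize(
                new Http().send(httpReq).getBody(), RootObject.class);
            // ... process data here ...
        }
    }

    // Usage: System.enqueueJob(new StudentSyncJob());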

Method 2

Try https://github.com/Trigger2991/SFDC-JSON-Parser 🙂

It seems lighter weight, but then you have to pull the attributes out with code.

You could also split elements using Util_JsonParser and then call JSON.deserialize on each of them to avoid the heap size limit.
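In the same spirit, here is a minimal sketch using the standard System.JSONParser streaming API rather than Util_JsonParser (whose exact methods are not shown in this post). It assumes the payload has a top-level "students" array and that Student is one of the JSON2APEX classes; the raw body string still counts against the heap, but the full typed object graph never has to exist all at once:

    // Walk the payload with the streaming parser and map one array element at a time.
    JSONParser parser = JSON.createParser(res.getBody());
    List<Student> students = new List<Student>();
    while (parser.nextToken() != null) {
        // Assumption: the records live under a top-level "students" array.
        if (parser.getCurrentToken() == JSONToken.FIELD_NAME
                && parser.getText() == 'students') {
            while (parser.nextToken() != null
                    && parser.getCurrentToken() != JSONToken.END_ARRAY) {
                if (parser.getCurrentToken() == JSONToken.START_OBJECT) {
                    // Read one whole student object into the typed class.
                    Student s = (Student) parser.readValueAs(Student.class);
                    students.add(s);
                    // Skip any child markers of the object just read.
                    parser.skipChildren();
                    // Process/persist in batches here and clear the list to keep heap low.
                }
            }
        }
    }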


All methods were sourced from stackoverflow.com or stackexchange.com and are licensed under CC BY-SA 2.5, CC BY-SA 3.0, and CC BY-SA 4.0.
