My Local Falcon HTTP calls to run-scan are apparently giving Lindy too much data. Is there anything that can be done about this? If not, I need to move this whole workflow to N8N. >> Task URL <<
I'd love to learn more about how you are adding Local Falcon to your workflows, btw!
Oh hey David f.! So happy to hear that this is being fixed! I was about to have to switch to N8n next week. I'll test it on Tuesday and report back if it's processing the data from scans with no issues. I'm adding LF through HTTP requests: I have a workflow where a new lead comes in from a conference intake form and is recorded in GoHighLevel -> triggers a webhook to Lindy -> does a bunch of enrichment stuff, incl. a Google Maps search (if multi-location, it finds all of them) -> runs HTTP requests to LocalFalcon for find -> add -> scan -> updates a spreadsheet with the data. Check it out here
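In rough Python, the shape of it looks like this; every function below is just a stand-in for one of the Lindy nodes, and all the names and dummy values are placeholders, not real code from the automation:

# Shape of the workflow only -- each function stands in for a Lindy node.
# Every name and value here is a placeholder, not the real automation.

def enrich_lead(payload: dict) -> dict:
    # enrichment steps (stubbed)
    return {**payload, "keyword": "botox near me"}

def find_locations(lead: dict) -> list[dict]:
    # Google Maps search; returns every location if multi-location (stubbed)
    return [{"name": lead.get("company", ""), "place_id": "PLACE_ID"}]

def local_falcon_scan(location: dict, keyword: str) -> dict:
    # stands in for the find -> add -> run-scan HTTP requests (stubbed)
    return {"place_id": location["place_id"], "keyword": keyword}

def append_to_sheet(row: dict) -> None:
    # spreadsheet update (stubbed)
    print(row)

def handle_new_lead(webhook_payload: dict) -> None:
    # GoHighLevel intake form -> webhook -> Lindy
    lead = enrich_lead(webhook_payload)
    for loc in find_locations(lead):
        scan = local_falcon_scan(loc, lead["keyword"])
        append_to_sheet({**lead, **scan})

handle_new_lead({"company": "Example Med Spa", "email": "lead@example.com"})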
That's awesome! I've always felt LF geo-grids would be the perfect warm intro for an email since the data is so personalized to the individual.
Yeah we've been using LF scans as part of our sales nurture process for years now!
David f. Alon J. Still getting this error: Task failed due to a data size limit. This is usually caused by an action returning too much data. Please check your node settings and try again. We have a huge conference starting this Sunday, and I built a big enrichment automation for new conference leads. I really need to get this fixed by tomorrow; otherwise it leaves me barely enough time to switch the whole workflow to N8N.
Hey Sinan! Hmmm, I believe the information being passed back is up to the API endpoint. We don't have control over it as far as I'm aware. One potential option could be to use a code block to call the API and filter the output in the code. I haven't tested this out, however!
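Something along these lines, maybe. This is an untested sketch: the endpoint URL is a placeholder, and which keys you drop will depend on what the real response looks like.

# Untested sketch: call Local Falcon from a code block and return only a
# trimmed response, so downstream Lindy nodes never see the huge arrays.
# The URL is a placeholder; body_fields is whatever you already send
# from your HTTP request node.
import requests

def run_scan_trimmed(body_fields: dict, drop_keys: tuple = ()) -> dict:
    # Sending every value via files=(None, value) makes requests encode
    # the body as multipart/form-data, like a normal form-data body config.
    files = {k: (None, str(v)) for k, v in body_fields.items()}
    resp = requests.post("https://api.localfalcon.com/v1/run-scan",  # placeholder URL
                         files=files)
    full = resp.json()
    # Keep everything except the heavy keys; pass whichever big arrays
    # you don't actually need downstream, e.g. drop_keys=("some_big_key",).
    return {k: v for k, v in full.items() if k not in drop_keys}

The idea is that the full response only ever exists inside the code block, and only the trimmed dict gets handed to the next node.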
Alon J. The issue is that the output is too large and Lindy can't process it due to a data size limit. I can't add anything that fixes this, as the workflow stops the moment this error is hit. David f., hoping you can weigh in on this. I'm on super short notice (by tomorrow) to get this resolved or start rebuilding this in N8N.
Alon J. David f. I'm trying to reduce the "places" and/or "data" part of the payload (screenshot). I think that's what's hitting the data limit. David, the payload's so large I can't even paste it into Google Docs, and I have a brand new gaming laptop lol (I was trying to show it to you guys in GDocs). Any thoughts on how I can reduce the output payload size via the Body configuration? Here's my body config:
--boundary123
Content-Disposition: form-data; name="api_key"
6f2f2de405f962711b56894458a8606d
--boundary123
Content-Disposition: form-data; name="place_id"
ChIJeSAqB2S_yIARsAiphLf2SLs
--boundary123
Content-Disposition: form-data; name="keyword"
botox near me
--boundary123
Content-Disposition: form-data; name="lat"
36.1443959
--boundary123
Content-Disposition: form-data; name="lng"
-115.2577815
--boundary123
Content-Disposition: form-data; name="grid_size"
11
--boundary123
Content-Disposition: form-data; name="radius"
5
--boundary123
Content-Disposition: form-data; name="measurement"
mi
--boundary123
Content-Disposition: form-data; name="platform"
google
--boundary123--
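For reference, here's that body config as a single call outside Lindy. The endpoint URL below is a placeholder, the API key is swapped for a dummy value, and the other fields are exactly the ones above:

# The body config above as one multipart POST. Placeholder URL and dummy
# API key; every other field mirrors the form-data parts exactly.
import requests

fields = {
    "api_key": "YOUR_LF_API_KEY",  # real key omitted here
    "place_id": "ChIJeSAqB2S_yIARsAiphLf2SLs",
    "keyword": "botox near me",
    "lat": "36.1443959",
    "lng": "-115.2577815",
    "grid_size": "11",
    "radius": "5",
    "measurement": "mi",
    "platform": "google",
}
# files=(None, value) tuples force requests to send multipart/form-data,
# matching the boundary-delimited body above.
resp = requests.post(
    "https://api.localfalcon.com/v1/run-scan",  # placeholder URL
    files={k: (None, v) for k, v in fields.items()},
)
print(len(resp.content))  # this response size is what's hitting Lindy's limit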
Okay, I reduced the radius to 3 miles and the grid size to 9. I got the same error, but now on my next node: Task failed due to a data size limit. This is usually caused by an action returning too much data. Please check your node settings and try again. This is the output from my next node:
{
  "access_token": "ya29.a0AS3H6NzBJXKl5__IVcDl_wv65kEALY3hsRPWRPOMyziwSLH46FbQOLH0MXVK8jyIzdhrTUnDg2j-wlRwwjOjK_ONgymArH0nkTjsQ3awJ5FoRwxioREgZIjoDoiXaxso5FVyL5lHXzqRfVmeP95E1bY_ueGcKfgTOxn3o6jD3QaCgYKAZgSARASFQHGX2MixDXdeX9v77piWzEMlaMpdA0177",
  "expires_in": 3599,
  "scope": "https://www.googleapis.com/auth/drive https://www.googleapis.com/auth/spreadsheets",
  "token_type": "Bearer"
}
^ Is the "data size limit" mentioned in the error referring to the size of the entire task?
Could I please get some dev support in here 🥹😢
But to answer your question: I'm pretty sure the data size limit is on our end, not yours.
Sinan M. We have updated our documentation for retrieving a specific scan report with targeting report data here: https://docs.localfalcon.com/#de29721a-df0d-4ff0-a647-8a384588b3ec You can add a parameter called "fields" and set the value to any of the following: rankings, places, data_points, ai_analysis, scan_summary
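For example, here's a minimal sketch of fetching a scan report with only the summary fields. The endpoint path and the report identifier parameter below are placeholders, so please check the linked docs for the exact names; the "fields" values are the documented options.

# Minimal sketch: request a scan report with only the fields you need.
# The URL path and "report_key" parameter name are placeholders (see the
# linked docs for the exact ones). Documented "fields" values:
# rankings, places, data_points, ai_analysis, scan_summary.
import requests

resp = requests.post(
    "https://api.localfalcon.com/v1/reports",     # placeholder path
    files={
        "api_key": (None, "YOUR_LF_API_KEY"),
        "report_key": (None, "YOUR_REPORT_KEY"),  # placeholder parameter name
        "fields": (None, "scan_summary"),         # limits the response payload
    },
)
print(resp.json())

Requesting only scan_summary (or rankings) should keep the response much smaller than the full payload with all the places and data points.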