
Parse JSON in Power Automate Without Losing Your Mind

The Parse JSON action confuses everyone. Here's the no-nonsense guide to schemas, nested properties, null handling, and when to skip it entirely.

6 min read

I have personally helped at least 20 people with this exact action. Colleagues, clients, people on forums. The conversation always starts the same way: “My flow was working and then Parse JSON just… broke.”

Parse JSON is one of the most used actions in Power Automate and somehow also one of the most confusing. It shouldn’t be. The action does one thing: it takes a blob of text that happens to be JSON and turns it into named properties you can reference in later steps. That’s it.

But the schema system, the error messages, and the way it handles missing data make it feel way harder than it should be. Let’s fix that.

What Parse JSON Actually Does

When you get a response from an HTTP call, a connector, or a Compose action, Power Automate often treats it as a generic object or a plain string. You can see data in the run history, but when you try to reference specific fields in later actions, they don’t show up in the dynamic content panel.

Parse JSON solves this. You give it:

  1. Content — the JSON data (from a previous action’s output)
  2. Schema — a description of what the JSON looks like

Once configured, all the properties defined in your schema appear as dynamic content in downstream actions. Instead of writing expressions to dig into the data, you just click field names.

Generating the Schema (The Right Way)

Nobody writes these schemas by hand. Here’s the process:

  1. Run your flow once so you have a real response
  2. Copy the JSON output from the run history
  3. In the Parse JSON action, click Generate from sample
  4. Paste your sample JSON
  5. Click Done

Power Automate generates the schema for you. It works about 80% of the time.

The other 20%? That’s where the pain starts. The generator infers types from your sample data, and if your sample has a null where a string usually goes, or a number where you sometimes get a string, the schema will be wrong — and your flow will fail on real data.

Always review the generated schema. Look for these problems:

  • Properties typed as "type": "integer" that could sometimes be null
  • Properties typed as "type": "string" that are actually numbers in some responses
  • Arrays that were empty in your sample (they’ll show as "items": {} with no type info)
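For example, if your sample happened to contain an empty array like "tags": [], the generator emits something like this, with no item type at all (the tags name is just an illustration):

```json
"tags": {
  "type": "array",
  "items": {}
}
```

Once you know what the array actually holds, tighten it yourself, for example to an array of nullable strings:

```json
"tags": {
  "type": "array",
  "items": { "type": ["string", "null"] }
}
```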

Handling Null and Missing Properties

This is the number one reason Parse JSON fails in production. Your sample data had every field populated. Real data doesn’t.

When a property is null and your schema says "type": "string", Parse JSON throws an error:

Invalid type. Expected String but got Null.

The fix: allow multiple types in your schema. Change this:

"email": {
  "type": "string"
}

To this:

"email": {
  "type": ["string", "null"]
}

Do this for every property that could ever be null. Yes, it’s tedious. Yes, it’s necessary. I usually edit the schema directly in the JSON editor and do a find-and-replace: change "type": "string" to "type": ["string", "null"] for any field that isn’t guaranteed to have a value.

For integer fields that could be null:

"quantity": {
  "type": ["integer", "null"]
}
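Putting the pieces together, a complete schema for a small object with nullable fields might look like this (the field names are illustrative, not from any specific API):

```json
{
  "type": "object",
  "properties": {
    "email": { "type": ["string", "null"] },
    "quantity": { "type": ["integer", "null"] },
    "status": { "type": "string" }
  }
}
```

One more thing worth checking: if the generator added a required array, any key listed there must be present in the data. Allowing "null" in the type handles null values, but a property that is missing entirely still fails validation if it appears in required, so trim that list too for fields that may be absent.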

Arrays vs. Objects

This trips people up constantly. A quick refresher:

Object — a single thing with named properties, wrapped in { }:

{
  "name": "Contoso",
  "city": "Seattle"
}

Array — a list of things, wrapped in [ ]:

[
  { "name": "Contoso", "city": "Seattle" },
  { "name": "Fabrikam", "city": "Dallas" }
]

If your JSON is an array at the top level, your schema needs "type": "array" at the root, not "type": "object". The Generate from sample button handles this correctly most of the time, but double-check.
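For reference, a root-level array schema for the sample above might look like this:

```json
{
  "type": "array",
  "items": {
    "type": "object",
    "properties": {
      "name": { "type": "string" },
      "city": { "type": "string" }
    }
  }
}
```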

After parsing an array, you won’t see individual fields in the dynamic content panel until you’re inside an Apply to each loop. That’s by design — Power Automate doesn’t know which item you want until you iterate.
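Once you're inside the loop, clicking dynamic content writes the expression for you, but if you need to write it by hand it looks roughly like this (assuming the loop action is named Apply_to_each):

```
items('Apply_to_each')?['name']
```

The shorter item() also works and refers to the current item of the innermost enclosing loop.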

Accessing Nested Properties

Say your parsed JSON looks like this:

{
  "account": {
    "contact": {
      "firstName": "Jane",
      "lastName": "Smith"
    }
  }
}

After parsing, the dynamic content panel shows firstName and lastName as clickable fields. Behind the scenes, it generates expressions like:

body('Parse_JSON')?['account']?['contact']?['firstName']

The ? operator is important — it’s the safe navigation operator. If account is null, the expression returns null instead of throwing an error. If you write your own expressions to access nested properties, always use ?['propertyName'] instead of ['propertyName'].

For deeply nested data (3+ levels), I find it cleaner to use multiple Parse JSON actions — one for the outer object and one for the inner structure — rather than one giant schema. It makes the flow easier to read and easier to debug.
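A sketch of that pattern, using the example above: give the first Parse JSON a schema that leaves account as a plain object, then point a second Parse JSON at the inner piece (action names here are illustrative):

```json
{
  "type": "object",
  "properties": {
    "account": { "type": "object" }
  }
}
```

The second Parse JSON's Content is then:

```
body('Parse_JSON')?['account']
```

with its own schema describing contact, firstName, and lastName.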

Common Errors and How to Fix Them

“Invalid type. Expected String but got Null”

Already covered above. Add "null" to the type array in your schema.

”Invalid type. Expected String but got Integer”

Your API returned a number where the schema expected a string. Either update the schema to allow both types:

"type": ["string", "integer", "null"]

Or, if the value should always be a string, fix it upstream with a Compose action:

string(outputs('HTTP')?['body']?['id'])

“ActionFailed. An action failed. No dependent actions succeeded.”

This usually means the Content input to Parse JSON isn’t valid JSON at all. Check the run history — look at the raw input. Common causes:

  • The previous action returned an error page (HTML, not JSON)
  • The JSON is wrapped in extra quotes, making it a string instead of an object
  • The response is empty

For the double-quoted string problem, use the json() function to convert it:

json(body('HTTP'))

Schema validation passes but dynamic content is empty

This happens when your JSON is valid but doesn’t match the property names in your schema. JSON property names are case-sensitive. "FirstName" in the data won’t match "firstName" in the schema.
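As a quick illustration, this pair validates cleanly but produces empty values, because the names differ only in case:

```json
{ "FirstName": "Jane" }
```

against a schema declaring:

```json
"firstName": { "type": "string" }
```

Validation passes because firstName simply isn't present in the data (presence is only enforced by a required list), so the dynamic content comes back empty.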

When to Skip Parse JSON Entirely

Here’s something a lot of Power Automate guides won’t tell you: you don’t always need Parse JSON. For simple cases, expressions work fine and save you an action.

Use the json() function

If you have a JSON string and just need one or two values from it:

json(body('HTTP'))?['status']
json(triggerBody()?['data'])?['customer']?['name']

No schema. No validation. Just grab the value.

Use coalesce() for defaults

When a property might be null and you want a fallback:

coalesce(json(body('HTTP'))?['email'], 'no-email@contoso.com')

coalesce() returns the first non-null value. It saves you from building branching logic around null checks.

Use Compose + expressions for quick transforms

A Compose action with a json() expression is often all you need:

json(body('HTTP'))

Then in later steps, reference it with:

outputs('Compose')?['propertyName']

You lose the nice dynamic content panel, but you avoid the schema validation headaches. For flows where the API response changes frequently, this is sometimes the pragmatic choice.

When You Should Use Parse JSON

Skip the expressions-only approach when:

  • Multiple people maintain the flow — dynamic content is easier for non-developers to understand than raw expressions
  • The JSON has a complex nested structure — clicking through dynamic content beats writing outputs('Compose')?['data']?['records']?['attributes']?['name'] by hand
  • You need to iterate over an array — Apply to each works cleanly with parsed output
  • You want the flow to fail early — schema validation catches bad data at the Parse JSON step instead of four actions later when an expression returns unexpected results

Quick Reference

  • Need 1-2 fields from simple JSON → json() expression directly
  • Complex or nested structure → Parse JSON with a reviewed schema
  • Properties might be null → add "null" to the type arrays in the schema
  • JSON comes as a quoted string → wrap it in json() before parsing
  • API response shape changes often → Compose + expressions, skip the schema
  • A team maintains the flow → Parse JSON for readable dynamic content

Parse JSON isn’t complicated once you understand that the schema is the source of almost every problem. Generate it from real data, allow nulls everywhere they could appear, and check your property name casing. For simple cases, skip it and use expressions. That covers about 95% of the issues I’ve seen.
