
Advanced Linux Logging: Parsing JSON Logs with the jq Command


By Noman Mohammad


Hey, let me tell you what `jq` did for my logs

Picture me at 2 a.m. last month, staring at a 600 MB JSON log dump. My coffee’s gone cold. The alert on the dashboard screams “Payment service is failing.” I open the file and… just… wow. Pages and pages of nested JSON, all tiny fonts, no breaks. My first instinct? I closed the laptop.

I couldn’t afford that luxury. Twenty minutes of panicked `grep -i` left me with 1,438 useless matches. At that point I shouted: “There has to be a smarter way!”

The one-minute pitch for jq

Turns out, jq was sitting right on my system already. It’s a small, free command-line JSON processor, and its filter syntax reads almost like plain English. Don’t gloss over it: treated right, this one tool can stand in for an entire dashboard.

  • Install it in one breath (Ubuntu? `sudo apt install jq`. Done).
  • Type `jq '.' myfile.json` and it’ll color, indent, and beautify the mess instantly.

No GUI lag. No loading spinners. Just raw speed.
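
For reference, here’s the shape of log entry every command below assumes: one JSON object per line. The field names (timestamp, level, service, message) are from my app; swap in your own. A made-up sample line, before and after jq:

echo '{"timestamp":"2025-08-26T10:14:03Z","level":"error","service":"auth-api","message":"token expired"}' | jq '.'
{
  "timestamp": "2025-08-26T10:14:03Z",
  "level": "error",
  "service": "auth-api",
  "message": "token expired"
}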

Zero-to-hero cheat-sheet

Here’s the stuff I now run daily, copy-pasted clean:

  1. Pretty-print
    jq '.' app_logs.json
  2. Pull just the good bits (time + message only; -r drops the JSON string quotes)
    jq -r '.timestamp + ": " + .message' app_logs.json
  3. Filter auth-api errors from one hour (the startswith prefix pins 10:00–10:59 on 2025-08-26)
    jq -r 'select(.level == "error" and .service == "auth-api" and (.timestamp | startswith("2025-08-26T10"))) | "\(.timestamp) -> \(.message)"' app_logs.json
  4. Real-time tail
    tail -f app_logs.json | jq 'select(.level=="error")'
  5. Quick count of each error type (-s slurps the file into one array first)
    jq -s 'map(select(.level == "error") | .message) | group_by(.) | map({message: .[0], count: length})' app_logs.json
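
Fed the made-up sample entry from above, that last command would print:

[
  {
    "message": "token expired",
    "count": 1
  }
]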

A small story on speed

Last week a customer complained: “Slow page loads in Singapore.” I’d never touched the new Asia-Pacific cluster before, and its gzipped logs had grown to 1.2 GB since morning. One pipeline did the job:

zcat ec2-singapore-log.gz |
  jq -r '
    (.timestampEpoch | tonumber) as $time |
    select($time > now - 3600) |
    select(.url | contains("order/checkout")) |
    "\($time), \(.latencyMS), \(.user.ip)"' |
  awk -F', ' '$2 > 1000' |   # FS must match jq's ", " or $2 keeps a trailing comma
  head -20

Twenty slow requests appeared in three seconds. No queries, no dashboards. Every slow IP traced back to one bad firewall rule; I fixed the VPC route and closed the ticket in under ten minutes flat. My teammate still won’t believe it.
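
Want to know which client IPs dominate the slow requests? A tiny extension of the same pipeline ranks them with nothing but standard tools (same fields as the pipeline above):

zcat ec2-singapore-log.gz |
  jq -r '
    (.timestampEpoch | tonumber) as $time |
    select($time > now - 3600) |
    select(.url | contains("order/checkout")) |
    "\($time), \(.latencyMS), \(.user.ip)"' |
  awk -F', ' '$2 > 1000 {print $3}' |   # keep only the IP of each slow request
  sort | uniq -c | sort -rn | head      # rank IPs by how often they show up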

Three tiny super-tricks nobody taught me

  • If a line is malformed JSON, jq can tag it instead of dying. A plain filter never gets the chance (parse errors happen first), so read each line as a raw string with -R and try to parse it yourself:
    jq -R 'try fromjson catch "BROKEN"' chunky.log
  • Big gigabyte files? Stream them chunk-by-chunk with jq --stream so your laptop does not freeze into a brick (there’s a sketch right after this list).
  • You can output straight to CSV for Excel-happy managers:
    jq -r '[.timestamp,.level,.service,.message] | @csv' app_logs.json > daily.csv
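
--stream is the least obvious of the three, so here’s a minimal sketch. Say big.json (a stand-in name) holds one giant top-level array: the fromstream/truncate_stream idiom from the jq manual re-emits its elements one at a time, without ever holding the whole array in memory:

jq -cn --stream 'fromstream(1 | truncate_stream(inputs))' big.json

Each element comes out as a normal, compact JSON object, so you can pipe it straight into any filter from the cheat-sheet.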

The busy-sysadmin install note

The machine you’re on probably ships with jq already. Still, here’s the one-liner for my tired friends:

command -v jq >/dev/null || { apt-get update -qq && apt-get install -y jq; } || yum install -y jq

Copy-paste it as root (or prefix the installs with sudo), hit Enter, enjoy.
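
To confirm it landed, feed it the tiniest possible JSON:

echo '{"ok": true}' | jq '.ok'   # prints: true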

Recap in seven seconds

Quit drowning in raw log spaghetti. One tool, short commands, instant clarity. jq is the flashlight in the JSON cave; once you hold it, you can’t go back to the dark.

I learned this the hard way. Save yourself 2 a.m. tears. Give jq ten minutes; it’ll pay you back ten-fold.
