Day 63 of 100 Days of AI

I’m now 69% of the way through the AI agents course, and I’ve added a new idea to the list of side projects I’d like to build after the course. I’m looking forward to these because I can already see how agents could automate both personal and work-related tasks.

Big Data is Dead?

I also came across this post today about big data. It’s written by an engineer who worked on Google BigQuery. The post argues that most organisations don’t need big data solutions (e.g. NoSQL databases like MongoDB, or specialist data warehouses like Google BigQuery or Snowflake).

The illustration below, from the post, shows what the author remembers from his time working with “big data”. He found that most customers had less than 1 terabyte of data, and that among users spending more than $1,000 a year on BigQuery, 90% of queries involved less than 100 MB of data.

This raises the question of why so many organisations invest in big data infrastructure. “Big Data” usually refers to data sizes in the range of hundreds of terabytes (TB) to petabytes (PB), and only a tiny fraction (the top 1%) of organisations actually handle data at that scale.
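
To put those numbers in perspective, here’s a rough sketch of my own (not from the post): a table of around 100 MB fits comfortably in memory on an ordinary laptop and can be analysed with plain pandas, no data warehouse required. The table and column names below are made up purely for illustration.

```python
import numpy as np
import pandas as pd

# Synthetic "orders" table: ~6M rows x two 8-byte columns ≈ 100 MB,
# roughly the data volume of the typical query described in the post.
n_rows = 6_000_000
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "customer_id": rng.integers(0, 50_000, n_rows),
    "amount": rng.exponential(40.0, n_rows).round(2),
})

print(f"In-memory size: {df.memory_usage(deep=True).sum() / 1e6:.0f} MB")

# A typical analytical query -- total spend per customer, top 10 --
# runs in well under a second on an ordinary laptop.
top_customers = (
    df.groupby("customer_id")["amount"]
      .sum()
      .nlargest(10)
)
print(top_customers)
```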