How integrating data sources led a manufacturer to more consistent chicken size and larger profits
Integrating data sources is, in my humble opinion, one of the most under-utilized tools in a tech stack. Let me explain why… with chickens.
One of the companies I used to work with when I was in the data warehousing field was a large chicken processing company. Think the type of chicken you buy at stores like Target and Costco.
I’m a huge fan of the rotisserie chickens at Costco. They are absolutely massive, perfectly salty, and for $5, you can’t complain. My boyfriend actually owns a Costco membership specifically because of the delicious-yet-cheap rotisserie chicken, and more importantly, he values the consistency. This is a massive advantage and a selling point for Costco.
Here’s the kicker: the consistency of Costco’s rotisserie chicken is the result of integrated data sources. This winds back to the chicken processing company, and the data they are using around farming.
This chicken supplier I was working with was having issues with chicken weight, and they didn't know why. Not coincidentally, they also had limited production data around their manufacturing process.
When chicken weight isn't within a certain range, this company incurs big losses. High-paying retailers, like Costco, only buy chickens that fall within a specific weight range. If the chickens are too small or too big, they can't be used for rotisserie, and the whole "consistency" selling point goes down the drain.
With high-paying retailers out the door, chicken manufacturers have to turn to lower-paying buyers, like dog food producers. This was a big financial hit, and the unpredictability continued to hurt their bottom line for years.
After researching their options, this company decided to take a bet on data. They couldn't find many patterns in the little internal data they had, so they integrated third-party weather data. Yes, you read that right — weather data!
Turns out, inclement weather ruined the chickens' appetites.
When it got too hot, the chickens lost their appetite, ate less, weighed less, and incurred losses for the company. Similarly, very cold weather (and shorter days) led to less egg-laying, which meant fewer fully grown chickens 3–6 months down the line.
By integrating weather data, this company was able to predict variations in chicken weight and reduce financial losses.
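The integration itself can be surprisingly simple. Here's a minimal sketch of what joining a third-party weather feed onto internal production data might look like; all column names and numbers here are hypothetical, invented for illustration (the source doesn't describe the company's actual schema or tooling):

```python
import pandas as pd

# Hypothetical internal production data: weekly average bird weight per farm
production = pd.DataFrame({
    "farm_id": [1, 1, 2, 2],
    "week": ["2023-06-05", "2023-06-12", "2023-06-05", "2023-06-12"],
    "avg_weight_lb": [4.1, 3.6, 4.2, 3.7],
})

# Hypothetical third-party weather feed: weekly average temperature by farm region
weather = pd.DataFrame({
    "farm_id": [1, 1, 2, 2],
    "week": ["2023-06-05", "2023-06-12", "2023-06-05", "2023-06-12"],
    "avg_temp_f": [78, 97, 76, 99],
})

# The integration step: join the two sources on shared keys
merged = production.merge(weather, on=["farm_id", "week"], how="left")

# Weight can now be analyzed against weather, e.g. flagging heat-wave weeks
merged["heat_wave"] = merged["avg_temp_f"] > 95
print(merged[["farm_id", "week", "avg_weight_lb", "heat_wave"]])
```

One join later, a pattern that was invisible in either dataset alone (weight dips tracking hot weeks) becomes a simple groupby away.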
Companies face a paradox of choice around data and SaaS — with so many options to build a tech stack, it’s easy to get lost in the sauce of collecting the coolest tools. From my experience, the best data solutions often don’t rely on super advanced tech. They rely on a good data roadmap, and cool tech is the cherry on top.
Focusing on buzzwords like machine learning and artificial intelligence without having a clear picture of your data is like trying to speed when there’s no road to drive on.
Look at process before tools. Sometimes, a connected pipeline or new code to import data is the most practical data product, and it builds a strong base for future data initiatives (and larger chickens).