My Candid Take: Being a Meta Data Science Intern

I’m Kayla. I spent one summer as a Data Science intern at Meta, on the Instagram side. I sat in Menlo Park. Lots of bikes. Lots of cold brew. And yes, the food was very good. But you want the real stuff, right? What I did, what worked, and what didn’t.
If you'd like to compare this write-up with another honest perspective, take a look at this detailed breakdown of a Meta Data Science internship.

What I actually worked on

I didn’t just make decks. I shipped things. Three projects stood out.

  1. Notification “cooldown” test for Reels creators
  • Problem: Some creators got too many pings. They felt spammed.
  • What I did: I wrote SQL in Presto to find high-risk groups (creators posting 5+ times a day).
  • I built a simple rule-based throttle with our PM and an engineer.
  • I ran an A/B test in our experiment tool (think PlanOut style).
  • Result: We saw an 8% drop in complaint tickets and no real drop in posts. That balance felt rare and sweet.
  2. Forecast for daily active users in Brazil
  • I used Python, pandas, and Prophet to forecast DAU for Reels in BR.
  • I pulled data from Hive via Presto.
  • I made a small dashboard in Superset so the team could watch week by week.
  • We caught a holiday dip early (Carnaval week), which saved some “why did DAU move?” panic.
  3. Trust checks on a new ranking feature
  • I built guardrail metrics: crash rate, time spent, hides, reports.
  • I tracked click-through rate and 1-day return rate.
  • When a pipeline broke (Airflow job failed on a Monday), I wrote a quick fix and backfilled data.
  • I learned to leave notes in the wiki, so no one else would chase the same bug at 9 p.m.
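For the curious, the cooldown rule was roughly this shape. This is a minimal sketch, not our production code, and the numbers (three pings per six hours) are made up for illustration:

```python
from collections import deque
from datetime import datetime, timedelta

class NotificationCooldown:
    """Rule-based throttle: skip a ping if the creator already
    received `max_pings` notifications within the trailing `window`."""

    def __init__(self, max_pings=3, window=timedelta(hours=6)):
        self.max_pings = max_pings
        self.window = window
        self.sent = {}  # creator_id -> deque of recent send times

    def should_send(self, creator_id, now):
        q = self.sent.setdefault(creator_id, deque())
        # Evict sends that have fallen out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.max_pings:
            return False  # cooldown active; drop this ping
        q.append(now)
        return True
```

The nice thing about a rule this dumb is that it's easy to explain in a review and easy to A/B test against the status quo.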

You know what? The hard part wasn’t the math. It was asking better questions.
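The real Brazil forecast used Prophet, but the spirit of the holiday-dip catch fits in plain Python: compare each day's DAU to a trailing baseline and flag big drops. The 28-day window and 15% threshold here are illustrative, not the numbers we actually used:

```python
from statistics import mean

def flag_dips(dau, baseline_days=28, drop_threshold=0.15):
    """Return indices of days where DAU falls more than
    `drop_threshold` below the trailing `baseline_days` average."""
    flags = []
    for i in range(baseline_days, len(dau)):
        baseline = mean(dau[i - baseline_days:i])
        if dau[i] < baseline * (1 - drop_threshold):
            flags.append(i)
    return flags
```

A check this simple won't predict Carnaval; the forecast model did that. But a guard like it is what pages you before leadership asks "why did DAU move?"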

Tools I touched daily

  • SQL with Presto and Hive
  • Python (pandas, NumPy, matplotlib, seaborn)
  • Jupyter notebooks
  • Superset dashboards
  • Airflow for pipelines
  • Git for reviews
  • PyTorch (light use, for a tiny prototype, not production)

There was also Llama talk everywhere, but my work was classic DS: metrics, tests, decisions.

A normal day (well, kind of normal)

Mornings were quiet. I’d check a metric board. I’d peek at open tests. Then standup.
After that, I paired with a PM to shape a “what if we throttle X?” idea. I wrote queries. I cleaned data. I made way too many charts. In the afternoon, I met my mentor for a 30-min 1:1. We talked numbers and also feelings. Sounds odd, but it helped.

Some days got meeting-heavy. Some days I coded for hours. When a launch hit, it got busy. Like “late Slack, cold fries” busy. Not every day. But it happened.

Weekends were my reset button. I'd sometimes shoot down to the Los Angeles area for a change of pace.

Daily routines differ by office, of course—if you’re wondering how a New York placement stacks up, this recap of a Data Science internship in New York captures that vibe.

What surprised me (and what didn’t)

  • Data trust mattered more than speed. If the metric is wrong, nothing else matters.
  • People were kind. Reviews were blunt, but fair.
  • The first two weeks felt slow. Getting access took time. Then it was fast. Very fast.
  • Docs saved me. I left breadcrumbs in our wiki so I wouldn’t trip twice.

Honestly, I thought I would do more machine learning. I didn’t. And that was fine. The business questions were fun.

The good stuff

  • Real impact. That notification test shipped, and people saw it.
  • Strong mentorship. My manager gave clear notes, not vague fluff.
  • Good tooling. Presto is fast. Superset is simple and enough.
  • Culture felt open. If I pinged someone, they answered. Even a director once. Wild.

The rough edges

  • Onboarding drag. Access gates slowed me down at first.
  • Some pipelines flaked. Debugging took time.
  • Privacy reviews were slow. Needed, yes. But slow.
  • Context overload. So many metrics. Names start to blur.

I’ll say this twice because it matters: write things down. Write things down. Future you will thank you.
Want a peek at how these pain points look outside Big Tech? One candid take on the Costco Data Science internship offers an interesting counterpoint.

Results that mattered (to me)

  • That 8% drop in creator complaints? I’m proud of that.
  • The Brazil forecast cut surprise moments for leadership during a key push.
  • I got a return offer. I didn’t expect that going in. I said yes later, after I caught my breath.

Who would enjoy this role

  • You like SQL and product questions.
  • You enjoy tests and tradeoffs.
  • You don’t mind messy data.
  • You can explain a chart to a PM in one minute, no fluff.

If you want pure ML research all day, you may feel restless. This is product work. It’s shipping choices.

To see how the hiring process can unfold from initial reach-out to final offer, Lindsey Gao lays out her own recruitment journey to a Meta Data Science internship in vivid detail.

Tips if you’re applying

  • Know SQL cold: joins, windows, cohorts.
  • Practice A/B test reads: lift, p-values, guardrails, and power.
  • Build a mini dashboard (Superset, Tableau, or even a clean notebook).
  • Tell one story: problem → method → result → learnings. Keep it tight.
  • In the interview, say what you’d measure and why. Then say what would break it.
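On the A/B practice tip: here's a tiny, self-contained sketch of a two-proportion z-test read, the lift-plus-p-value check that comes up constantly. The example numbers in the usage below are hypothetical:

```python
from math import sqrt, erf

def ab_test_read(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: returns (relative lift, two-sided p-value)
    for conversion counts conv_* out of sample sizes n_*."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF, via erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    lift = (p_b - p_a) / p_a
    return lift, p_value

# e.g. ab_test_read(100, 1000, 120, 1000) -> 20% lift, p around 0.15:
# promising, but not significant at 0.05. Say that out loud in the interview.
```

Then name your guardrails (crash rate, hides, reports) and what sample size you'd need for power. That combination reads as "has actually run a test."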

For a blow-by-blow account of the actual screening and onsite loops, you might find this first-person review of the Meta Data Science interview pretty helpful.

Another handy resource is the crowdsourced list of Meta Data Science Intern interview questions on Glassdoor.

Small bonus: bring a short, real example. I used a “reduce notification spam” story from a college app. It showed I knew levers and tradeoffs.

Final call: would I do it again?

Yes. 9 out of 10. It wasn’t perfect. The waits bugged me. Some days felt like metrics soup. But I learned a lot. I shipped things that helped people. And I felt heard.

If you land this role, breathe. Ask one more question than you think you need. Then ship one simple thing that really moves a needle. That’s the job. And it’s a good one.