A short and sweet, yet full-blown, guide to building what the title suggests.
Introduction
Who is this for
This article aims to help mobile game developers who are:
Building their own data analytics stack on a budget
New to data analytics and want to implement best practices without hiring external experts
Launching a game or app and need to understand user behavior to make informed decisions
Looking to set up a solid base for a future-proof analytics solution
What you will get
This article is the first in a series on how to create a nearly free mobile game analytics stack (a stack that can be expanded later).
In this article we will go over the motivation behind owning a custom data analytics stack, a step-by-step guide on how to put together readily available tools from Google (Firebase, BigQuery, and Looker Studio), and best practices.
By following this article, you can develop a personalized set of KPI dashboards that you own and can build upon in the future, expanding with game, economy, live ops and marketing dashboards (which will be covered in the next articles in the series).
Additionally, we are preparing shorter helper articles that will be linked throughout to help you with the terminology (glossary) and specific details (a primer on BigQuery & SQL, an overview of Looker Studio capabilities, etc.).
Who are we (authors)
Numu Collective is a network of experts who are building a resource hub for scaling and launching games & apps. Our goal is to help small to mid-size studios and publishers who have limited resources to learn and use the right tools so they can compete with larger companies in today's super competitive market.
Authors of this article
Jakub Marek - Director @ Numu Labs and Founder @ Numu Collective
David Kadlub - Senior data engineer @ Pocket Worlds
Robert Magyar - Growth lead @ Fingersoft
We have a combined 25+ years of experience in creating data analytics solutions that were used to gather and analyze data from more than 2.5 billion players across more than 100 games, big and small, from companies like EA, LEGO, Fingersoft, Boombit, Dapper Labs, NimbleBit, Big Fish Games, HypeHype and others.
We have experience with architectures and technologies for data warehousing (Google Cloud, AWS, Snowflake), data processing (dbt, …) and data visualization (Looker and Looker Studio, Tableau, Sisense). Additionally, we have developed solutions based on end-to-end platforms (Amplitude, PlayFab, GameAnalytics, Mixpanel, Bloomreach, Devtodev, …).
Motivation
You can’t improve what you don’t measure.
In today’s fast-paced gaming world, everyone should understand that analytics are no longer just nice to have—they’re a must.
Reliable data lets you spot which features really hook players, find out why people drop off, and figure out how to fine-tune your monetization. With data, you can make better tweaks to your game, boost player retention, and grow your revenue—without ever having to guess what might work.
Where to start
The world of analytics solutions is quite mature, and a ton of different solutions exist for every individual step. There are complete end-to-end platforms (such as XXX), and there are multiple choices for data warehousing (such as XXX), for data collection (such as XXX), and for data visualization and reporting (such as XXX).
One of the simpler ways to get started is to use connected tools from the Google ecosystem. Why Google? The basic functionality of the products we are going to discuss is free, there are a lot of learning resources available, and Google's products connect to each other out of the box, which will minimize the upkeep in the long run.
Also, you can be fairly sure you will be using tools that will be supported for years to come (it is true that even Google has deprecated some well-liked products, but compared to third-party tools from lesser-known companies, the risk of the tools we are discussing being deprecated is negligible).
To get to a working analytics solution with useful reports, we need to start with collecting data. Most games today already implement Google Firebase; it is free and offers a ton of useful features, especially for smaller teams ([add list]).
One of the key components of Firebase is data collection through the Firebase SDK, which allows for out-of-the-box reporting within the Firebase web app. While Firebase's default dashboard and reporting are good for quick starts, they can quickly become limited, especially if you are looking for:
Customization of reports
In-depth analysis of users or events
Custom and more complex filters
Non-standard charts (e.g. funnels) or cohorted metrics
As such, most developers over time find they need to expand beyond the out-of-the-box functionality. One possible solution, if we want to stay within Google's ecosystem, is to:
Leverage Firebase's ability to collect data and push it to Google BigQuery
Use BigQuery's query engine to process the data into the desired format
Utilize Google’s Looker Studio for data visualization and reporting.
Different tiers of setup complexity
Let’s quickly go over our options when it comes to using Firebase for analytics:
Basic Firebase Dashboard
The Firebase web application offers some pre-made reports
Works out of the box just with the SDK integration
Google Analytics for Firebase
Firebase data can be further visualized in Google Analytics
Again, this should work with minimal setup
GA has a more powerful UI and offers more options for how to slice the data, but is still limited (e.g. it is difficult to access custom user and event parameters)
Custom solution
Exporting data from Firebase to BigQuery + processing data in BigQuery + building custom dashboards in Looker Studio.
This offers almost complete freedom in what you can create and allows you to expand with more advanced use cases later.
This will be the scope of this article.
Use cases for the analytics stack
When thinking about a future-proof analytics stack, we should define the use cases and the respective solutions:
KPI dashboard - focus of this article
Use case: Reports with essential KPIs covering the user journey at a high level - acquisition, retention, engagement and monetization
In this article we will focus on the basic setup, which will let you analyze the monetization data coming from IAPs
Implementing ad revenue is a more advanced topic that we will expand on in a following article - stay tuned
Game dashboard - future article
Use case: Going beyond the traditional metrics requires the collection of custom in-game user and event parameters
While the principles are similar to those we are implementing in this article, there are a lot more topics we will need to cover, so stay tuned.
Marketing dashboard - future article
Use case: When you are ready to scale your game using paid user acquisition, you will need additional functionality and will introduce marketing spend and ROI (profitability) into the metrics & charts
In most cases this will require an integration of additional solutions (usually an MMP for attribution such as Appsflyer or Singular)
As with game dashboards, the complexity is higher; with the additional metrics, the attribution logic and the complications caused by Apple's privacy framework (SKAN), this requires a standalone article - stay tuned
Additional link: Link to a [Helper Article: Glossary of Terms & Concepts] for readers unfamiliar with concepts like “nested data,” “cohort,” or “SDK.”
Prerequisites & Goals
Prerequisites
While this guide is aimed at beginners, we expect you to have:
A mobile game (or test project) with Firebase SDK integrated.
Basic familiarity with Firebase Console.
Access to Google Cloud (BigQuery) and Google Looker Studio.
End Goals
Have a “flattened” events table in BigQuery.
Produce a “Daily User State” table (or view).
Create a Looker Studio dashboard showing daily KPIs (DAU, revenue, retention, etc.).
Step-by-step guide
1. Setting Up Firebase
Implement the Firebase SDK
Add link to the Google documentation.
Make sure that you are collecting data - check the realtime analytics dashboard (or the streaming insert to BigQuery (next step)).
Other option: use the Debug view to check if you are collecting data (more complicated -> helper article).
Enabling BigQuery Export
Step-by-step: in Firebase Console → Project Settings → Integrations → Link to BigQuery.
URL: https://console.firebase.google.com/project/{{YOUR_PROJECT_ID}}/settings/integrations/bigquery
Confirm the daily or streaming export option.
Check if the data is being pushed into BigQuery (depending on the option, this might take up to 24 hours in the case of daily exports) - see the query sketch below for a quick check
URL: https://console.cloud.google.com/bigquery?project={{PROJECT_ID}}
Beware: Data will start flowing into BigQuery only after you enable the export; there is no way to export historical data.
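Once the export is enabled, a quick way to verify that events are landing is to count them per exported table. A minimal sketch (your_project and the analytics_XXXXXX dataset name are placeholders; yours will differ):

    -- Count exported events per table shard to confirm data is arriving.
    SELECT
      _TABLE_SUFFIX AS table_suffix,  -- e.g. '20240101' or 'intraday_20240101'
      COUNT(*) AS event_count
    FROM `your_project.analytics_XXXXXX.events_*`
    GROUP BY table_suffix
    ORDER BY table_suffix DESC;

With streaming export enabled you should see an events_intraday_YYYYMMDD table fill up during the day; with daily export, the first events_YYYYMMDD table appears after the first daily run.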
Tip: Link to a [Helper Article: “Firebase-BigQuery Export: A Quick Tour of the Raw Schema”] with sample screenshots of the nested fields for those who want deeper detail.
2. Preparing the Data for Visualization
Understanding the schema of raw Firebase data in BigQuery
In BigQuery, the Firebase data is available in the main export dataset, with nested structures (arrays) containing event parameters, user properties, etc.
This nesting is the reason we’ll do a flattening step next.
Creating an Events_Flat Table
Why Flatten?:
Nested data can be harder to query and debug, especially for beginners.
Flattening makes each event one row, with top-level columns for key event parameters.
Essentially, flattened data works like a standard Excel or Google Sheets table.
Note: Batch exports create a daily sharded table (one events_YYYYMMDD table per day).
Sample SQL Query to Flatten Events:
Below is a simple, annotated query that pulls from the raw events_YYYYMMDD (and events_intraday) tables and writes to a new “events_flat” table.
Important fields: event_name, event_timestamp, user_pseudo_id, platform, country, app_version, event_value_in_usd (if applicable), etc.
To keep the table fresh, schedule a recurring query in BigQuery (or use a script) to run it daily.
In a future article we will go over methods on how to flatten tables with user and event parameters automatically.
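For now, here is a minimal manual sketch. The names your_project, analytics_XXXXXX and the destination dataset analytics_flat are placeholders, and the level_name parameter is a hypothetical custom event parameter used only for illustration:

    -- Flatten raw Firebase events: one row per event, top-level columns.
    -- The destination dataset (analytics_flat) must already exist.
    CREATE OR REPLACE TABLE `your_project.analytics_flat.events_flat`
    PARTITION BY event_date AS
    SELECT
      PARSE_DATE('%Y%m%d', event_date) AS event_date,
      TIMESTAMP_MICROS(event_timestamp) AS event_time,
      event_name,
      user_pseudo_id,
      platform,
      geo.country AS country,
      app_info.version AS app_version,
      event_value_in_usd,
      -- Pull a single event parameter out of the nested event_params array;
      -- repeat this pattern for every parameter you care about.
      (SELECT value.string_value FROM UNNEST(event_params)
       WHERE key = 'level_name') AS level_name,
      -- Engagement time is useful later for session duration metrics.
      (SELECT value.int_value FROM UNNEST(event_params)
       WHERE key = 'engagement_time_msec') AS engagement_time_msec
    FROM `your_project.analytics_XXXXXX.events_*`
    -- The wildcard would also match events_intraday_*; this suffix filter
    -- keeps only the finalized daily shards.
    WHERE _TABLE_SUFFIX BETWEEN '20240101' AND FORMAT_DATE('%Y%m%d', CURRENT_DATE());

In production you would typically make this incremental (appending only yesterday's shard) rather than rebuilding the whole table each day.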
Building Aggregated Tables: Daily User State
Purpose of Daily Aggregates:
Each user gets one row per day.
Similar to creating a pivot table with user ID and activity date as the dimensions
Platform and country are descriptive fields (not a breakdown, as we want to have only one country and one platform per user per day = per row)
This also raises the question of what to do with multi-platform users or users who play from different countries on a single date = helper / future article
Makes it easy to compute daily active users, daily revenue, sessions per user, etc.
Key Fields in Daily User State Table:
Must have:
Date (the activity date)
user_pseudo_id (unique user ID from Firebase) => Helper article: Working with users and how to choose the right user ID (proper DAU if users log in with multiple devices etc - devices vs users)
platform (iOS/Android)
country
app_version
first_open_date (for cohort analysis)
first_app_version
Naive solution - process all flattened data every day to get the first value for each user
Better, more advanced solution (if you have more data) - incrementally process the data
daily_iap_gross_revenue
daily_iap_net_revenue
Naive solution - discount the gross revenue by the store fees (either 15% or 30% based on your current revenue; check with Google Play or the App Store)
More advanced solution - also discount VAT; this requires a lookup table with VAT rates per country so that VAT can be properly deducted from the revenue (a sketch follows after the daily user state query below)
daily_iap_purchases_count
daily_sessions
daily_session_duration
Optional or more advanced:
Ad revenue fields (work automatically with AdMob).
ad_revenue
Only for AdMob - other mediation platforms (if used) can provide user-level data that can be combined with the Firebase data directly, once the same user ID is implemented on both sides.
ad_impressions
Subscription fields (not available in Firebase data by default, custom integration needed)
subscriptions_started/renewed/cancelled/upgraded/downgraded
Since subscriptions are triggered from Google Play or iTunes, the data is not accurately available on the client and as such requires a custom, more advanced solution (beyond the scope of this article = future article)
subscriptions_revenue
Sample SQL to Generate Daily User State:
The sketch below groups by user_pseudo_id and date, summing up event counts, revenue, etc.
The output table is partitioned by date to manage query costs efficiently.
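A minimal sketch, building on the hypothetical events_flat table from the previous step (standard Firebase event names; the fee and session logic are intentionally simplified):

    -- One row per user per day, built from the flattened events.
    -- Uses the naive approach for first_open_date: scan all data and
    -- take each user's earliest activity date.
    CREATE OR REPLACE TABLE `your_project.analytics_flat.daily_user_state`
    PARTITION BY activity_date AS
    WITH first_seen AS (
      SELECT user_pseudo_id, MIN(event_date) AS first_open_date
      FROM `your_project.analytics_flat.events_flat`
      GROUP BY user_pseudo_id
    )
    SELECT
      f.event_date AS activity_date,
      f.user_pseudo_id,
      fs.first_open_date,
      -- Naive: one arbitrary value per user per day; see the helper
      -- article for handling multi-platform / multi-country users.
      ANY_VALUE(f.platform) AS platform,
      ANY_VALUE(f.country) AS country,
      MAX(f.app_version) AS app_version,
      COUNTIF(f.event_name = 'in_app_purchase') AS daily_iap_purchases_count,
      SUM(IF(f.event_name = 'in_app_purchase', f.event_value_in_usd, 0)) AS daily_iap_gross_revenue,
      -- Naive net revenue: flat 30% store fee (use 0.85 instead of 0.70
      -- if you qualify for the 15% tier).
      SUM(IF(f.event_name = 'in_app_purchase', f.event_value_in_usd, 0)) * 0.70 AS daily_iap_net_revenue,
      COUNTIF(f.event_name = 'session_start') AS daily_sessions,
      -- Approximate session duration from the engagement_time_msec parameter.
      SUM(COALESCE(f.engagement_time_msec, 0)) / 1000 AS daily_session_duration_sec
    FROM `your_project.analytics_flat.events_flat` f
    JOIN first_seen fs USING (user_pseudo_id)
    GROUP BY activity_date, user_pseudo_id, first_open_date;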
Tip: Link to a [Helper Article: “Daily Aggregations in BigQuery”] with more advanced partition/clustering tips.
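For the more advanced net revenue variant mentioned above, a per-country VAT lookup can be joined in afterwards. A sketch, assuming a hypothetical vat_rates table with columns country and vat_rate (e.g. 0.20 for 20%):

    -- Recompute net revenue: strip VAT first, then apply the store fee.
    -- vat_rates is a hypothetical lookup table you maintain yourself.
    -- Note: the exact order of VAT and fee deduction differs per store
    -- and country, so treat this as an approximation.
    SELECT
      d.* EXCEPT (daily_iap_net_revenue),
      d.daily_iap_gross_revenue
        / (1 + COALESCE(v.vat_rate, 0))  -- remove VAT where a rate is known
        * 0.70                           -- then apply the 30% store fee
        AS daily_iap_net_revenue
    FROM `your_project.analytics_flat.daily_user_state` AS d
    LEFT JOIN `your_project.lookup.vat_rates` AS v
      ON d.country = v.country;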
3. Creating Your First KPI Dashboard in Looker Studio
Now you have two options: you can either create your own dashboard from scratch, or you can use our Basic KPI dashboard template (recommended).
Option 1 (using the template)
[Template Link] to a sample Looker Studio report that readers can copy and modify with their own data source.
Steps
Make a copy of the KPI Dashboard template.
In your own copy, connect Looker Studio to BigQuery:
Walk through the data connector process in Looker Studio.
Choose your “daily user state” table as the source.
The dashboard should come alive with your own data
You can change, edit and add new elements as you like.
Best practices applied in our template:
Filtering
Breakdowns
Structure of dashboards (per use case vs exploratory)
Beware: The default currency of your data will also be used in the dashboard
If you need currency conversion, an additional step is needed
Tips for Sharing & Collaboration:
Inviting team members.
Setting up viewer-only vs. editor roles.
Scheduling email reports or PDF exports.
4. Interpreting the Data
High-Level KPIs to Monitor:
Daily Active Users (DAU), Daily New Users, Day-1/7 Retention, Revenue, etc.
Track these metrics over time and explore correlations (e.g., changes after a new app version release).
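As an example of a cohorted metric, Day-1 retention can be computed directly from the daily user state table sketched earlier (swap INTERVAL 1 DAY for INTERVAL 7 DAY to get Day-7 retention):

    -- Day-1 retention per install cohort from the daily user state table.
    SELECT
      first_open_date AS cohort_date,
      COUNT(DISTINCT IF(activity_date = first_open_date,
                        user_pseudo_id, NULL)) AS new_users,
      SAFE_DIVIDE(
        COUNT(DISTINCT IF(activity_date = DATE_ADD(first_open_date, INTERVAL 1 DAY),
                          user_pseudo_id, NULL)),
        COUNT(DISTINCT IF(activity_date = first_open_date,
                          user_pseudo_id, NULL))
      ) AS d1_retention
    FROM `your_project.analytics_flat.daily_user_state`
    GROUP BY cohort_date
    ORDER BY cohort_date;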
Common Pitfalls:
Tracking too many events without planning.
Not verifying that data is accurate before building dashboards.
Overcomplicating early dashboards—start simple, then iterate.
Cost of the Solution
Google Cloud offers $300 in free credits for new projects.
Brief mention: There may be minimal costs (BigQuery storage/queries), but for small-to-medium volumes, it’s often free or very low.
Conclusion
Summary
We have set up a robust pipeline: Firebase → BigQuery → Flattening → Daily Aggregates → Looker Studio.
Next Steps
Future Enhancements:
A/B testing frameworks (e.g., Firebase A/B Testing + BigQuery analysis).
Churn prediction or LTV modeling with BigQuery ML.
Integrating cost data from UA channels for full ROI calculations.
Questions & Feedback
If you have questions about or feedback on the article, please send an email to jakub@numucollective.com or reach out via LinkedIn.