As companies like Carousell push more reporting into cloud data platforms, a bottleneck is emerging inside business intelligence stacks. Dashboards that once worked fine at small scale begin to slow down, queries stretch into tens of seconds, and minor schema errors ripple through reports. In short, teams find themselves balancing two competing needs: stable executive metrics and flexible exploration for analysts.
The tension is becoming common in cloud analytics environments, where business intelligence (BI) tools are expected to serve both operational reporting and deep experimentation. The result is often a single environment doing too much, acting as a presentation layer, a modelling engine, and an ad-hoc compute system at once.
A recent architecture change inside Southeast Asian marketplace Carousell shows how some analytics teams are responding. Details shared by the company's analytics engineers describe a move away from a single overloaded BI instance toward a split design that separates performance-critical reporting from exploratory workloads. While the case reflects one organisation's experience, the underlying problem mirrors broader patterns seen in cloud data stacks.
When BI becomes a compute bottleneck
Modern BI tools allow teams to define logic directly in the reporting layer. That flexibility can speed up early development, but it also shifts compute pressure away from optimised databases and into the visualisation tier.
At Carousell, engineers found that analytical "Explores" were frequently linked to extremely large datasets. According to Analytics Lead Shishir Nehete, datasets sometimes reached "hundreds of terabytes in size," with joins executed dynamically inside the BI layer rather than upstream in the warehouse. The design worked, until scale exposed its limits.
Nehete explains that heavy derived joins led to slow execution paths. "Explores" pulling large transaction datasets were assembled on demand, which increased compute load and pushed query latency higher. The team found that 98th percentile query times averaged roughly 40 seconds, long enough to disrupt business reviews and stakeholder meetings. The figures are based on Carousell's internal performance monitoring, supplied by the analytics team.
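For readers unfamiliar with the metric, a p98 query time is the latency below which 98% of queries complete; tracking it surfaces the slow tail that averages hide. A minimal sketch of the calculation, using invented sample latencies rather than Carousell's data:

```python
import math

def percentile(samples, pct):
    """Nearest-rank percentile of a list of latency samples (seconds)."""
    ranked = sorted(samples)
    # Nearest-rank method: take the ceil(pct/100 * n)-th value, 1-indexed.
    rank = math.ceil(pct / 100 * len(ranked))
    return ranked[rank - 1]

# Invented latencies: most queries are fast, a few heavy on-demand joins are slow.
latencies = [1.2, 0.8, 2.5, 3.1, 40.2, 1.0, 0.9, 38.7, 2.2, 1.5]
p98 = percentile(latencies, 98)  # dominated by the slowest joins
```

The mean of these samples looks acceptable, but the p98 value exposes the 40-second outliers that stakeholders actually experience.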
Performance was only part of the problem: governance gaps created additional risk. Developers could push changes directly into production models without tight checks, which sped up feature delivery but introduced fragile dependencies. A small error in a field definition could cause downstream dashboards to fail, forcing engineers into reactive fixes.
Separating stability from experimentation
Rather than continue fine-tuning the existing setup, Carousell engineers chose to rethink where compute work should live. Heavy transformations were moved upstream into BigQuery pipelines, where database engines are designed to perform large joins. The BI layer shifted toward metric definition and presentation.
The bigger change came from splitting responsibilities across two BI instances. One environment was dedicated to pre-aggregated executive dashboards and weekly reporting. The datasets were prepared in advance, allowing leadership queries to run against optimised tables instead of raw transaction volumes.
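The pre-aggregation idea, rolling raw transactions up into a small summary table before the BI layer ever touches them, can be sketched in plain Python (the table shape and field names are invented for illustration, not Carousell's schema):

```python
from collections import defaultdict

# Invented raw transaction rows, standing in for warehouse-scale data.
raw_transactions = [
    {"day": "2024-06-01", "category": "electronics", "amount": 120.0},
    {"day": "2024-06-01", "category": "fashion",     "amount": 35.0},
    {"day": "2024-06-02", "category": "electronics", "amount": 80.0},
    {"day": "2024-06-02", "category": "electronics", "amount": 40.0},
]

def rollup_daily(rows):
    """Pre-aggregate raw rows into one summary row per (day, category).

    Executive dashboards then query this small summary table instead of
    assembling joins over raw transaction volumes on demand in the BI layer.
    """
    summary = defaultdict(lambda: {"orders": 0, "revenue": 0.0})
    for r in rows:
        key = (r["day"], r["category"])
        summary[key]["orders"] += 1
        summary[key]["revenue"] += r["amount"]
    return dict(summary)

daily = rollup_daily(raw_transactions)
```

In production this rollup would run as a scheduled warehouse pipeline, so leadership queries scan a table whose size grows with days and categories rather than with transaction counts.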
The second environment remains open for exploratory analysis. Analysts can still join granular datasets and test new logic without risking performance degradation in their executive colleagues' workflows.
The dual structure reflects a broader cloud analytics principle: isolate high-risk or experimental workloads from production reporting. Many data engineering teams now apply similar patterns in warehouse staging layers or sandbox projects. Extending that separation into the BI tier helps maintain predictable performance as usage grows.
Governance as part of infrastructure
Stability also depended on stronger release controls. BI Engineer Wei Jie Ng describes how the new environment introduced automated checks via Looker CI and Look At Me Sideways (LAMS), tools that validate modelling rules before code reaches production. "The system now automatically catches SQL syntax errors," Ng says, adding that failed checks block merges until issues are corrected.
Beyond syntax validation, governance rules enforce documentation and schema discipline. Every dimension requires metadata, and connections must point to approved databases. The controls reduce human error while creating clearer data definitions, an important foundation as analytics tools begin to add conversational interfaces.
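Stripped of tooling specifics, checks of this kind reduce to asserting properties of each model before a merge is allowed. The sketch below illustrates the pattern in Python; the rule names, model structure, and approved-connection list are invented, not Carousell's actual LAMS configuration:

```python
# Hypothetical sketch of a pre-merge governance check in the spirit of
# LAMS-style rules: every dimension needs a description, and the model
# must point at an approved connection. All names here are invented.

APPROVED_CONNECTIONS = {"prod_bigquery"}

def validate_model(model):
    """Return a list of violations; an empty list means the merge may proceed."""
    violations = []
    if model.get("connection") not in APPROVED_CONNECTIONS:
        violations.append(f"unapproved connection: {model.get('connection')}")
    for dim in model.get("dimensions", []):
        if not dim.get("description"):
            violations.append(f"dimension '{dim['name']}' missing description")
    return violations

model = {
    "connection": "prod_bigquery",
    "dimensions": [
        {"name": "order_id", "description": "Unique order identifier"},
        {"name": "gmv"},  # missing metadata: this should block the merge
    ],
}
errors = validate_model(model)
```

Run in CI, a non-empty violation list fails the check and the merge is blocked until the metadata is added, the behaviour Ng describes.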
According to Carousell engineers, structured metadata prepares datasets for natural-language queries. When conversational analytics tools read well-defined models, they can map user intent to consistent metrics instead of guessing at relationships.
Performance gains – and fewer firefights
After the redesign, the analytics team reported measurable improvements. Internal monitoring shows those 98th percentile query times falling from over 40 seconds to below 10 seconds. The change altered how business reviews unfold. Instead of asking whether dashboards were broken, stakeholders could focus on evaluating data live. Just as importantly, engineers could shift away from constant troubleshooting.
While every analytics environment has its own constraints, the broader lesson is simple: BI layers shouldn't double as heavy compute engines. As cloud data volumes grow, separating presentation, transformation, and experimentation reduces fragility and keeps reporting predictable.
For teams scaling their analytics stacks, the question isn't about tooling choice but about architectural boundaries: deciding which workloads belong in the warehouse and which live in BI.




