Downer’s Data with IBM
When it comes to city transportation, accuracy is essential. As one of the top providers of integrated services in Australia and New Zealand, Downer sees itself as a protector of the complex transportation network and is always looking for ways to improve its efficiency. With more than 200 trains and numerous onboard sensors, Downer has amassed a massive quantity of data.
Although Downer routinely extracts useful insights from its data, its collaboration with IBM Client Engineering sought to probe the dataset’s possibilities more deeply, with a particular emphasis on tools and techniques that could simplify its analytical procedures.
Refining data insights with increased cooperation and knowledge
Downer constantly strives to improve operational effectiveness while reducing maintenance expenses and disruptions to transportation schedules. The company envisioned a collaboration that would let it work through the data more thoroughly and uncover further insights that could improve operations. It joined forces with IBM specialists to explore the ever-expanding realm of data science. IBM Client Engineering ensured a valuable partnership by applying Lean Startup, pair programming, and the IBM Value Engineering Method.
IBM cross-team involvement, enablement, and support from customer success managers, IT specialists, and IBM Consulting bolstered this effort. Over two sprints and a series of workshops, Downer and IBM set off on an innovative and exploratory journey, working together to investigate the prospect of increasing Downer’s capacity to detect transportation anomalies four times faster within four weeks.
“This is a step towards potential transformation, understanding what other tools and methods can drive deeper insights,” says Downer’s Lead Data Scientist, Yin Wong.
Low-code and no-code approaches for the investigation of time series data
IBM introduced Downer to the field of no-code and low-code options for constructing predictive models that yield quicker insights. With watsonx.ai technology (deployed on IBM Cloud), Downer was able to reduce modeling times and increase anomaly detection efficiency.
Using advanced time series forecasting and anomaly prediction techniques along with IBM AutoAI and SPSS Modeler flows, Downer quickly developed predictive models that can be integrated into business operations to improve decision-making.
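The case study does not describe the models themselves, so as a minimal sketch of what time-series anomaly prediction on sensor data involves, here is an illustrative rolling z-score detector; the readings, window size, and threshold are assumptions for the example, not details of Downer’s actual pipeline:

```python
import pandas as pd

def flag_anomalies(readings: pd.Series, window: int = 24, threshold: float = 3.0) -> pd.Series:
    """Flag points that deviate more than `threshold` rolling standard
    deviations from the rolling mean. Window and threshold are
    illustrative defaults, not values from Downer's deployment."""
    mean = readings.rolling(window, min_periods=window).mean()
    std = readings.rolling(window, min_periods=window).std()
    z = (readings - mean) / std
    return z.abs() > threshold

# Hypothetical hourly sensor readings: a steady signal with one injected fault.
data = pd.Series([20.0] * 48)
data.iloc[40] = 35.0  # simulated defective-sensor spike
anomalies = flag_anomalies(data)
print(anomalies[anomalies].index.tolist())  # the spike at index 40 is flagged
```

In practice an AutoAI or SPSS Modeler flow replaces this hand-rolled statistic with automatically selected models, but the input/output shape, a sensor series in and flagged timestamps out, is the same.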
In doing so, Downer beat its target of one week to identify defective sensors, a substantial reduction in time. The major goal was not to revolutionize the method but to surface patterns that improve and refine data analysis while making it accessible to individuals without deep subject-matter expertise.
The solutions allowed Downer to improve its data exploration procedures and experiment with various models, highlighting:
Versatility: Downer was able to guarantee continuous data access while integrating a variety of data sources.
Speed and scalability: Downer used tools like Jupyter Notebooks to efficiently adapt models to a variety of data signals.
User-friendly exploration: Even for individuals without a strong background in data science, Downer found the model recommendations to be clear and beneficial in producing insights.
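Adapting models to a variety of data signals, as described above, can be sketched as fitting one forecaster across several sensor streams in a notebook; the signal names and the deliberately simple moving-average forecaster below are illustrative assumptions, standing in for the AutoAI and SPSS models used in the engagement:

```python
import statistics

def naive_forecast(history, horizon=3, window=4):
    """Forecast the next `horizon` points as the mean of the last
    `window` observations -- a simple stand-in for the real models."""
    level = statistics.fmean(history[-window:])
    return [level] * horizon

# Hypothetical sensor streams; in practice these would come from train telemetry.
signals = {
    "axle_temp": [61.0, 62.5, 61.8, 62.1, 62.4],
    "door_cycles": [110, 112, 109, 111, 113],
}

# The same forecaster is reused across every signal without per-signal code.
forecasts = {name: naive_forecast(series) for name, series in signals.items()}
for name, fc in forecasts.items():
    print(name, [round(v, 2) for v in fc])
```

The value of the loop is that adding a new sensor means adding one dictionary entry, not a new modeling script, which is the scalability property the bullet points describe.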
Employees at Downer express great satisfaction with the collaboration, and these tools open up numerous analytics possibilities. As Yin Wong explains, “It’s about leveraging what best fits our team’s needs and skill set.”
Additionally, Kes Gooding, RTS GM of Technology at Downer, acknowledged the potential for further operational improvements and hinted at a more thorough examination of the recommendations.
Together, navigating the data landscape
With this partnership, Downer and IBM Client Engineering demonstrated what can happen when two large enterprises work together toward common objectives to improve data insights. Over the course of four weeks, they delved into the vast landscape of data-driven decision making, reaffirming the importance of ongoing exploration in the current digital era.