Getting Ready for Advanced Analytics

Despite their best endeavours, organisations still struggle to move towards a data-driven culture. Transformations take time: while most aspire to being data-driven, few have realised the ambition. Technology change is not easy, but it is rarely the blocker to adoption; culture, decision making, and process are the impediments most often cited. So how might organisations get ready for Advanced Analytics?

Don’t Forget Enterprise Architecture

Whilst the discipline of Engineering has taken a significant step forward in large organisations, good Enterprise Architecture (EA) remains an instrument for smoothing the path to adoption. Despite its name, EA has traditionally been biased towards technology concerns at the enterprise level. However, it is undergoing a quiet revolution, with continuous architecture practices starting to emerge and the discipline broadening from its traditional roots towards the architecture of the enterprise as a whole. A well-known EA tool is the Blueprint: a tried and tested reference architecture that typically bundles best practices with decisions made in advance. Modern Data Blueprints have emerged over the past few years, and these can shorten decision cycles and accelerate the implementation of the technology and data pipelines that support Advanced Analytics.

Implement a Triumvirate

Advanced Analytics requires a different operating model within the organisation, one in which the analytics team works closely not just with the technology team but also with the data office. When funding cycles and portfolio planning run in silos, decisions made by any one of these departments can constrain the advancement of data adoption. By implementing a triumvirate, decisions are made collectively, ensuring all three parties advance business imperatives together.

Consider Hybrid Deployments

Enterprises are increasingly moving workloads back on-premise. This shift reflects the emerging dominance of the hybrid cloud model, which balances the benefits of cloud computing with those of existing on-premise data centres. On-premise infrastructure offers benefits such as security, control, and performance. To compete in today’s marketplace, however, organisations need to leverage cloud services to a greater or lesser extent, as the limitations of purely on-premise infrastructure are not difficult to foresee. The public cloud can address some of these, such as integrating disparate, unconnected infrastructure. It is not, however, suitable for every type of workload, and it can bring unpredictable costs stemming from limitless resources and egress charges. With a hybrid cloud strategy, public cloud and on-premise elements work in tandem to support the entire range of needs. The primary advantage is agility under a common data management strategy that also ensures usability and reduced complexity for analytics teams.
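As a rough illustration, placing workloads under a hybrid strategy often reduces to a handful of policy rules. The Python sketch below is a minimal, hypothetical example; the categories, rules, and function name are illustrative assumptions rather than any particular platform’s API.

# Minimal sketch of hybrid workload placement. The categories and
# rules are illustrative assumptions, not a product API.
def place_workload(data_sensitivity: str, bursty: bool, egress_heavy: bool) -> str:
    """Decide where a workload runs under a simple hybrid policy."""
    if data_sensitivity == "regulated":
        return "on-premise"    # keep regulated data under direct control
    if egress_heavy:
        return "on-premise"    # avoid unpredictable egress charges
    if bursty:
        return "public-cloud"  # elastic capacity suits spiky demand
    return "on-premise"        # steady-state work runs on owned capacity

# A spiky, non-regulated analytics job lands in the public cloud.
print(place_workload("internal", bursty=True, egress_heavy=False))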

Design a Modular and Service Oriented Data Architecture

Many organisations are now moving towards a highly modular data architecture, built from components that can be replaced with new technologies as needed. Exposing data via APIs (also known as Services or Interfaces) ensures that direct access to data is limited and secure, whilst at the same time providing faster, up-to-date access to common data sets. This allows data to be easily reused between teams, accelerates access, and enables seamless collaboration between analytics teams so that use cases can be developed more efficiently. An API management platform (API gateway) or Service Bus is needed to create and publish data-centric APIs, implement usage policies, control access, and measure usage and performance. Such a platform also allows users to search for and reuse existing data interfaces rather than build new ones. These technologies can either be embedded in the data architecture or developed as an external standalone capability. Data pipelines and interfaces simplify integration between disparate tools and platforms by shielding teams from complexity, increasing speed to market, and reducing the risk of introducing new problems into existing applications. They also make it easier to replace individual components as requirements change.
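To make the idea concrete, the sketch below shows a minimal, read-only data-centric API in Python. It assumes the FastAPI framework and a hypothetical sales-summary data set; the endpoint path, field names, and figures are all illustrative.

# Minimal sketch of a data-centric API; the data set and figures are
# hypothetical stand-ins for a governed data store.
from fastapi import FastAPI, HTTPException

app = FastAPI(title="Data API")

# In practice this would query a governed store; here it is stubbed.
SALES_SUMMARY = {
    "2023-Q4": {"revenue": 1250000, "orders": 8420},
    "2024-Q1": {"revenue": 1310000, "orders": 8775},
}

@app.get("/datasets/sales-summary/{quarter}")
def get_sales_summary(quarter: str) -> dict:
    """Expose a read-only view of a common data set."""
    if quarter not in SALES_SUMMARY:
        raise HTTPException(status_code=404, detail="Unknown quarter")
    return SALES_SUMMARY[quarter]

Saved as data_api.py, this runs with uvicorn data_api:app. Fronted by an API gateway, an endpoint like this can have usage policies, access control, and metering applied without touching the underlying data store.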

Establish Best Practices

DataOps is an emerging set of practices, processes, and technologies for developing and enhancing data and analytical pipelines to meet business needs quickly. As these pipelines become more complex and development teams grow, organisations need better collaboration and development processes to govern the flow of data and code from one step of the data lifecycle to another, from ingestion and transformation to analysis and reporting. The goal is to increase agility and shorten cycle times while reducing data defects, giving developers and business users greater confidence in analytics output. DataOps builds on concepts popular in software engineering, such as agile, lean, and continuous integration/continuous delivery, but addresses the unique needs of data and analytics environments. It relies heavily on test automation, code repositories, collaborative tools, orchestration frameworks, and workflow automation to accelerate delivery times while minimising defects.
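As an example of the test automation DataOps relies on, the Python sketch below shows a simple data-quality gate using pandas. The orders batch and its rules are hypothetical; in practice such checks would run automatically in the pipeline before data is published downstream.

# Minimal sketch of a data-quality gate; the frame and rules are
# illustrative assumptions.
import pandas as pd

def validate_orders(df: pd.DataFrame) -> list:
    """Return a list of data-quality failures; empty means the batch passes."""
    failures = []
    if df["order_id"].duplicated().any():
        failures.append("duplicate order_id values")
    if df["amount"].lt(0).any():
        failures.append("negative order amounts")
    if df["customer_id"].isna().any():
        failures.append("missing customer_id values")
    return failures

if __name__ == "__main__":
    batch = pd.DataFrame({
        "order_id": [1, 2, 2],
        "amount": [10.0, -5.0, 7.5],
        "customer_id": ["a", None, "c"],
    })
    # In a DataOps pipeline, a non-empty result would fail the run
    # before the batch reaches analysis and reporting.
    print(validate_orders(batch) or "batch passed")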

Like this article? You’re one step closer to making data-driven decisions.

Reach out to us to start a conversation – we welcome the opportunity to understand your data needs, challenges, and aspirations. We utilise an expert toolkit of data capabilities and services to help take the guesswork out of your decision making.
