Make Decisions Faster For Competitive Advantage

Posted on July 11, 2017


There’s been a lot of hype about the opportunities offered by Big Data – so how can it realistically offer your organisation competitive advantage? And how do you get your ICT infrastructure ready to take advantage of these opportunities?

Data Analytics is at the top of many CIO agendas. Organisations of all kinds are actively searching for new ways to uncover operational and customer data, harvest it and analyse it to support better decision-making, identify trends and inform forward planning.

The promise of Big Data and the Internet of Things (IoT) is becoming a reality, and any ICT strategy must cater for the collection and transfer of increasing quantities of data – as well as the applications that will enable faster and more informed decision-making to maximise speed to market and competitive advantage.

Leaving aside the many applications of Big Data and the IoT – these vary according to organisation and industry sector – what should you be considering now when planning your ICT infrastructure of the future?

Data is quickly becoming the most important asset in the enterprise. What does this mean for the CIO, the custodian of that asset? As Milton Friedman famously argued, the duty of managers is to maximise shareholders’ returns. By the same logic, it is the duty of the CIO to maximise the return on data.

Implications for your network

If you are planning any Big Data initiatives or business analysis applications, then your existing network could let you down. First ask yourself, “What role will our network play in the success of these innovations?” Specific questions to answer include:

Where is the source data we need to support decision-making?

If the data you need to analyse is distributed across your operations, you need to ensure it can be transferred across your network to the point of processing. This means ensuring the network ‘pipes’ carrying your data are adequately provisioned for high-speed transfer between locations.
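A quick back-of-envelope check can tell you whether a link is up to the job. The Python sketch below estimates the bandwidth needed to move a day’s data within a set transfer window – every figure in it is an illustrative assumption, to be replaced with your own numbers:

# Rough link-sizing check: can a WAN link move the day's data within
# the transfer window available? All figures are illustrative assumptions.

daily_data_gb = 500          # data generated per site per day (assumed)
transfer_window_hours = 8    # overnight window available for bulk transfer
link_mbps = 100              # provisioned WAN link speed (assumed)
utilisation = 0.7            # realistic share of the link usable for this job

required_mbps = (daily_data_gb * 8 * 1000) / (transfer_window_hours * 3600)
effective_mbps = link_mbps * utilisation

print(f"Required: {required_mbps:.0f} Mb/s, available: {effective_mbps:.0f} Mb/s")
if required_mbps > effective_mbps:
    print("Link is under-provisioned for this transfer window.")

With these sample figures the link falls well short, which is exactly the kind of gap you want to find at the planning stage rather than after go-live.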

Will we need to get it there in real time or can we wait an hour (or a day)?

If real-time reporting is essential to the nature of your Big Data or analysis application, these network pipes will need to be lightning fast. It also means the data must be continuously flowing as it is created or amended, rather than ‘batched’ into intermittent sends. This might also involve prioritising the data within your network pipes to minimise latency.

If your pipes are not up to this task, your users will not be making decisions on the latest data and may experience performance issues when using applications – such as delays in loading dashboards or graphs.
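The trade-off between batched and continuous transfer can be sketched in a few lines. In the Python outline below, record_stream and send_to_analytics are hypothetical stand-ins for whatever data source and transport your organisation actually uses:

# Sketch of batched vs continuous transfer. record_stream yields records
# as they are created or amended; send_to_analytics is a stand-in for
# your real transport (message queue, HTTPS endpoint, etc.).

BATCH_SIZE = 1000

def batched(record_stream, send_to_analytics):
    # Records wait until a batch fills: efficient on the network,
    # but each record can sit here for some time before it is sent.
    batch = []
    for record in record_stream:
        batch.append(record)
        if len(batch) >= BATCH_SIZE:
            send_to_analytics(batch)
            batch = []
    if batch:                        # flush the final partial batch
        send_to_analytics(batch)

def continuous(record_stream, send_to_analytics):
    # Each record is forwarded the moment it arrives: minimal latency,
    # but the pipes must sustain the peak record rate.
    for record in record_stream:
        send_to_analytics([record])

The batched approach is kinder to your pipes but means decisions are made on data that is minutes or hours old; the continuous approach keeps data fresh but requires the network to sustain the peak rate at which records are created.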

Implications for your data centre

Any new initiatives involving Big Data, data from the IoT or enterprise-wide decision support applications will put pressure on your storage and processing systems. So another question to ask is: “Will our existing Data Centre cope?” You will need to engage in some detailed capacity planning to make the best decision on how it can be optimised – or whether you need to look beyond it for a better solution:

How much more data will we need to store?

Once you’ve determined the answer to this, the next questions are: “How often will we need to back it up or replicate it, and how long will we need to store it for?” Whatever the answers, it is likely that you will need to completely re-think your storage strategy – and probably replace your existing storage systems.
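As a starting point for that capacity planning, the simple Python sketch below multiplies raw daily growth by the number of copies kept and the retention period – all of the inputs are illustrative assumptions:

# Back-of-envelope storage plan: raw growth x copies x retention.
# Replace these assumed inputs with your own figures.

daily_growth_tb = 0.5     # new data per day (assumed)
replicas = 3              # primary + backup + off-site copy (assumed)
retention_days = 365      # how long data must be kept (assumed)
annual_growth_rate = 0.3  # year-on-year increase in daily volume (assumed)

year1_tb = daily_growth_tb * retention_days * replicas
year3_tb = year1_tb * (1 + annual_growth_rate) ** 2

print(f"Capacity needed after year 1: {year1_tb:.0f} TB")
print(f"Projected for year 3: {year3_tb:.0f} TB")

Even modest daily volumes compound quickly once replication and retention are factored in, which is why a storage re-think is so often the outcome of this exercise.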

How much processing power will we need to crunch and deliver all this data?

Many Big Data, IoT and DSS/BI solutions – especially those collecting, analysing and reporting data from multiple sources, locations or applications in real or near real time – can only be effective with the deployment of specialised high-performance servers. Many of these have High Density (HD) power requirements – in other words, they need more power than can be delivered to a typical equipment rack. HD racks, which precious few in-house Data Centres can support, also call for greatly increased power redundancy and backup resources.
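A quick arithmetic check shows why. The Python sketch below compares the power demand of a rack of analytics servers with a typical in-house rack feed; the figures are assumptions for illustration only:

# Rack power-density check: does the planned server load fit the power
# a rack position can actually deliver? All figures are assumptions.

servers_per_rack = 20
watts_per_server = 1200       # high-performance analytics node (assumed)
rack_power_budget_kw = 8      # typical in-house rack feed (assumed)

demand_kw = servers_per_rack * watts_per_server / 1000
print(f"Rack demand: {demand_kw:.1f} kW vs budget {rack_power_budget_kw} kW")
if demand_kw > rack_power_budget_kw:
    print("High Density rack required - beyond a standard in-house feed.")

With these sample figures the rack demands three times what a standard feed can deliver, before any allowance for cooling or redundancy.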

If you find that your existing Data Centre facility is not up to the task – or upgrading it involves considerable capital investment – the next logical question is:

Would a co-location or cloud solution be less costly in the long run?

The information you are collecting, analysing and reporting on is core to your business. But transporting, storing and processing it is not. Over time, the amount of data – and thus the power needed to process it – will increase.

If you find that the required capital investment to ready your Data Centre for faster business decision-making is significant, it may be time to consider a pay-by-use Cloud model. Transferring business costs from CapEx to OpEx – paying only for what you need, when you need it – usually makes sound business sense. Plus, you can benefit from economies of scale and gain access to high-performance, highly available Data Centre facilities that would otherwise be unjustifiable.
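A simplified comparison over a planning horizon can frame that discussion. In the Python sketch below, every figure is an illustrative assumption – substitute the quotes and internal costs that apply to your organisation:

# Simplified CapEx vs OpEx comparison over a planning horizon.
# Every figure below is an illustrative assumption, not a quote.

years = 5
capex_upgrade = 1_500_000        # one-off Data Centre upgrade cost (assumed)
annual_opex_inhouse = 200_000    # power, cooling, maintenance, staff (assumed)
monthly_cloud_cost = 30_000      # pay-by-use equivalent capacity (assumed)

inhouse_total = capex_upgrade + annual_opex_inhouse * years
cloud_total = monthly_cloud_cost * 12 * years

print(f"In-house over {years} years: ${inhouse_total:,}")
print(f"Cloud over {years} years:    ${cloud_total:,}")

The real comparison is rarely this clean – data growth, egress costs and migration effort all matter – but a model like this is a sensible first pass before committing capital.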

Getting it right

Planning to exploit Big Data or the IoT for new ways of harvesting, analysing and acting on internal or external sources of data is a significant challenge, calling for specialised expertise. It also demands re-thinking how data is stored, where it is located and how it is transmitted throughout your organisation – taking all relevant security and compliance risks into consideration.

Any new data-related business initiative needs to be carefully considered from all angles – and can benefit from expert advice right from the initial planning stage.
