Navigating the use of data analytics in the public sector

In recent times, much has been made of the rise of data analytics and how it will revolutionise and revitalise businesses through a magical new term called “insight”.

We see the digital giants and mega-vendors building increasingly elaborate systems to give us a more convenient and intimate personal user experience. Predictive analytics, cognitive science and robotic process automation are all now commonplace in the data-driven world.

In the private sector, there’s currently a big drive towards hyper-personalisation of services as a means to attract and retain your custom. It’s now less about what is trending and more about what information matters to you on an individual level.

Yet in the public sector, there’s nothing new about hyper-personalisation – it’s practically part and parcel of what the sector exists to deliver.

Public benefits

If you think about it, there’s not much the public sector doesn’t know about you at an individual level.

It knows when and where you were born, who you married, your occupation, driving details, car ownership, house ownership, schooling, household income, medical records… you get the idea.

Given the rich stream of data that the public sector can access, surely the insight it can gain from this would be revolutionary?

Well, it’s not that simple. Legal and privacy regulations apply to government too, together with a strong duty of care over people’s most sensitive information.

Despite these restrictions, the public sector is using data analytics in a variety of ways to help improve its services whilst driving down operational costs.

More data than ever

In data analytics, the hard part is the “data”. There’s more of it than ever before, more than you can process, all at different levels of granularity and moving at different speeds.

You can no longer simply capture everything and hope to analyse it later. You need the ability to make decisions in real time.
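To make that distinction concrete, here is a minimal sketch of acting on a stream as it arrives rather than storing everything for later batch analysis. The readings and the threshold are invented purely for illustration and don’t represent any particular public sector system.

```python
# Minimal sketch: reacting to data as it arrives instead of batch-analysing it later.
# The readings and threshold below are invented purely for illustration.
from typing import Iterable, Iterator


def sensor_stream() -> Iterator[float]:
    """Stand-in for a live feed (e.g. river-level sensors); values are made up."""
    yield from [1.2, 1.3, 1.3, 1.5, 2.1, 2.8, 3.4]


def alerts(readings: Iterable[float], threshold: float = 2.5) -> Iterator[str]:
    """Emit an alert the moment a reading crosses the threshold."""
    for i, level in enumerate(readings):
        if level > threshold:
            yield f"reading {i}: level {level} exceeds {threshold} - consider intervention"


for alert in alerts(sensor_stream()):
    print(alert)
```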

Within central government, two different strategies for delivering data analytics have emerged.

The first is the more traditional approach of building an internal data capability from the infrastructure through to the people. This has the advantage of control and is suited to organisations whose data is often highly sensitive or inappropriate for sharing.

The second approach has been to embrace the open data movement and expose department data openly via datasets or API-based access.

This gives rise to an ecosystem of suppliers who develop digital services and applications based on the data made available. This in turn reduces the need for that department to maintain large and costly internal data analytics and software development practices.

Data.gov.uk now publishes more than 40,000 datasets for open use – the largest collection of its kind in the world.
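As a rough illustration of the API-based access described above, the sketch below searches the data.gov.uk catalogue. data.gov.uk is built on CKAN, so the endpoint path and response fields here follow CKAN’s standard action API; treat both as assumptions to verify against the live service rather than a guaranteed contract.

```python
# Sketch: querying the data.gov.uk catalogue via a CKAN-style "package_search" action.
# Assumptions: the endpoint path and the "result"/"count"/"results"/"title" fields
# follow CKAN's documented API; verify against the live service before relying on it.
import requests

CATALOGUE_URL = "https://data.gov.uk/api/3/action/package_search"  # assumed CKAN endpoint


def search_datasets(query: str, rows: int = 5) -> list[str]:
    """Return the titles of the first few published datasets matching a query."""
    response = requests.get(CATALOGUE_URL, params={"q": query, "rows": rows}, timeout=30)
    response.raise_for_status()
    result = response.json()["result"]
    print(f"{result['count']} datasets match {query!r}")
    return [package["title"] for package in result["results"]]


if __name__ == "__main__":
    for title in search_datasets("flood"):
        print("-", title)
```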

A force for good?

But should we trust the public sector with our information? Will data analytics be used as a force for good, or is it an enabler for the so-called ‘Orwellian state’?

As the societal megatrends of urbanisation and an ageing population place an increased load on public infrastructure and services, predicting demand is going to be critical to managing the future.

Making sense of the constant stream of data and interpreting it correctly will be pivotal to our economic success as well as our safety.

Of course, there’ll be people who see data analytics as an invasion of privacy and fear that their data is being exploited by the government to inform policies and schemes that work to their detriment. But this challenge existed long before mass data analytics.

Nonetheless, the opportunities for data analytics in the public sector continue to expand, from fraud and error detection through to traffic flow management and flood prediction.

Traditional data analytics has been centred on the descriptive and diagnostic variants of the discipline.

More recently, however, public sector organisations are increasingly seeking to understand what is likely to happen and then take pro-active measures and interventions. This approach, coupled with the rise of big data and cloud computing, is now making predictive and prescriptive analytics more affordable and practical to realise.
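As a toy sketch of that shift, the example below moves from a descriptive summary of past demand to a simple predictive projection. The quarterly figures and the linear trend model are invented for illustration and are not drawn from any department’s actual approach.

```python
# Toy sketch: descriptive analytics summarises what happened; predictive analytics
# projects what is likely to happen so capacity can be planned ahead of time.
# The demand figures and the simple linear trend model are invented for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression

quarters = np.arange(1, 13).reshape(-1, 1)  # quarters 1..12 of hypothetical history
demand = np.array([410, 432, 455, 470, 498, 515, 540, 566, 590, 612, 640, 668])

# Descriptive: report on the data we already have.
print(f"Average demand to date: {demand.mean():.0f} requests per quarter")

# Predictive: extrapolate the trend to anticipate next quarter's demand.
model = LinearRegression().fit(quarters, demand)
forecast = model.predict(np.array([[13]]))[0]
print(f"Forecast for next quarter: about {forecast:.0f} requests")
```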

Enabling technology

As the volume, variety and velocity of data start to exceed the human ability to extract meaning from it, the use of narrow artificial intelligence (or ‘weak AI’) and machine learning is set to become the new norm.

The resurgent use of the terms ‘artificial intelligence’ and ‘machine learning’ is fuelling media scare stories about the rise of machines – and impending human demise.

A far less dramatic view is that AI and machine learning are not especially new. They’ve been around for decades – but why are we talking about these terms now?

Well, AI is very compute-intensive and until recently it had been an expensive endeavour to build discrete and narrowly-focused AI capabilities.

The relatively recent growth of hyper-scale cloud services has brought down the cost and risk of computing to drive a renaissance in the field. The real value of AI is not in a machine’s ability to learn but in the algorithm that guides, informs and affirms the learning. Superior algorithms will deliver better results and a competitive edge.

So should we fear the use of AI? Well, if you’ve ever met anyone called Siri, Cortana or Alexa, then you probably don’t. AI has been here for some time and is already enhancing our lives in ways not previously possible.

Trust issues

The biggest data challenge the public sector faces is earning our trust. Today, people readily give away information about themselves in exchange for an online music account or for the use of social media.

It is, however, unlikely that they would do the same for HMRC or the DWP – unless they were required to do so by law.

As news of the latest corporate data breach splashes across the headlines, people are increasingly seeking assurances about the handling of their information.

Greater understanding of the value of personal data will lead us all to be more active in managing our digital footprint and be more cautious before divulging personal information online.

New data legislation such as the EU’s General Data Protection Regulation (GDPR) is a sign that data analytics will increasingly be tempered by requirements to ensure information is correctly managed, processed and protected.

It seems, then, that technology is not the problem here: technology provides us with new ways we can process, store and utilise our own personal data to transform our lives for the better.


