By: Glen Ogden, Regional Sales Director, Middle East at A10 Networks
Big data is no longer an industry buzzword; we are very much at the ‘end of the beginning’ for big data, and that introduces real uncertainty, and real opportunity. This is particularly true for banking and financial institutions, which are coming under severe pressure to shift focus from products to customers. Understanding customer behaviour, as well as social and market trends, will be key to the ability of financial institutions to retain customers and grow market share. Today, financial service organisations are leveraging big data analytics for strategic and competitive gain, to help transform processes and operations, and to identify new business opportunities.
Roughly three quarters of organisations that haven’t already deployed big data solutions appear to either have pilot schemes in place or be well into the planning process. One of the main challenges facing new entrants is the lack of publicly available use cases and reference architectures; organisations that have successfully invested in big data to optimise their workflows may keep the details closely guarded for the time being to maintain a competitive advantage.
Use Cases
Think about a
banking network with millions of customers, each with a different
activity profile and set of ‘normal’ or expected actions. This brings
into focus a number of complex variables that need to be weighted,
classified and correlated.
Using big data analytics, banks can harness all of their historical data to model customer preferences. The results can then be used to personalise event-based marketing campaigns for new products and services. When coupled with coordinated messaging across email, mobile, branch and ATM interactions, these targeted, personalised campaigns have a much higher probability of conversion than traditional mass email campaigns.
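As a loose illustration of the modelling step, the sketch below scores customers’ propensity to respond to a campaign using a simple logistic regression over synthetic activity features. The feature names, the model choice and the top-decile targeting rule are all illustrative assumptions rather than a prescribed approach.

```python
# Hypothetical sketch: score propensity to respond to a campaign from
# historical activity features, then target only the highest scorers.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-in for historical data: columns might represent
# [avg_monthly_spend, branch_visits, mobile_logins]; the label is
# whether the customer took up a comparable past offer.
X = rng.normal(size=(1000, 3))
y = (X[:, 2] + 0.5 * X[:, 0] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

model = LogisticRegression().fit(X, y)

# Score the customer base and target the top decile, rather than
# mass-mailing everyone.
scores = model.predict_proba(X)[:, 1]
top_n = len(scores) // 10
targets = np.argsort(scores)[-top_n:]
print(f"Targeting {len(targets)} of {len(scores)} customers")
```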
With cashless transactions becoming the norm, fraud is another big issue. Banks need to continuously monitor client behaviour for anything anomalous. This is done by monitoring the time, geolocation, transaction amount, transaction frequency and items purchased, and then mapping the behaviour against a template of what ‘normal’ looks like for that customer. Bear in mind that ‘normal’ for December may be very different from ‘normal’ in July. Spatio-temporal problems like this are non-trivial, and solving them requires highly efficient processing at scale. With data streaming in thick and fast, and potentially large financial transactions at stake, we ideally want to detect anomalies accurately and within a small time window. Accuracy here means not blocking valid transactions (false positives) and not letting fraudulent transactions through (false negatives).
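One way to picture that season-aware, per-customer baseline is a simple z-score check against statistics keyed by (customer, month). The five-transaction minimum and the 3-sigma threshold below are illustrative assumptions; a production system would weigh many more signals.

```python
# Illustrative sketch of a per-customer, per-month baseline check.
# Real systems combine many more signals (geolocation, merchant,
# frequency); the 3-sigma threshold is an arbitrary assumption.
from collections import defaultdict
from statistics import mean, stdev

# history[(customer_id, month)] -> list of past transaction amounts
history = defaultdict(list)

def record(customer_id, month, amount):
    history[(customer_id, month)].append(amount)

def is_anomalous(customer_id, month, amount, threshold=3.0):
    past = history[(customer_id, month)]
    if len(past) < 5:            # not enough data for a baseline yet
        return False
    mu, sigma = mean(past), stdev(past)
    if sigma == 0:
        return amount != mu
    return abs(amount - mu) / sigma > threshold

# Baselines are keyed by month, so 'normal' for December can differ
# from 'normal' for July without triggering seasonal false alarms.
for amt in (120, 135, 110, 128, 140):
    record("cust42", 12, amt)
print(is_anomalous("cust42", 12, 150))   # False: within December norm
print(is_anomalous("cust42", 12, 5000))  # True: far outside baseline
```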
Minimising false positives and false negatives simultaneously is notoriously hard in computer science, typically requiring a blend of statistical and computational intelligence techniques along with frequent training and tuning. Insights gained from the massive datasets that big data platforms can process, together with the new anomaly detection methods they enable, are likely to help optimise these processes considerably.
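The tuning trade-off can be seen in a toy threshold sweep: raising the decision threshold blocks fewer valid transactions but lets more fraud through. The scores and labels below are synthetic, purely to show the mechanics.

```python
# Illustrative sketch: sweep a decision threshold to trade off false
# positives (blocked valid transactions) against false negatives
# (missed fraud). Scores and labels are synthetic.
import numpy as np

rng = np.random.default_rng(1)
# Fraud scores for 10,000 transactions, of which ~1% are fraudulent;
# fraud tends to score high, legitimate activity low.
labels = rng.random(10_000) < 0.01
scores = np.where(labels, rng.beta(5, 2, 10_000), rng.beta(2, 5, 10_000))

for threshold in (0.3, 0.5, 0.7, 0.9):
    flagged = scores >= threshold
    fp = int(np.sum(flagged & ~labels))   # valid but blocked
    fn = int(np.sum(~flagged & labels))   # fraud let through
    print(f"threshold={threshold:.1f}  false positives={fp:5d}  "
          f"false negatives={fn:3d}")
```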
Storage
Perhaps the biggest challenge introduced by big data is the need to re-evaluate the storage-compute model.
The biggest benefits of big data are reaped by organisations that have a lot of legacy data. Realising those benefits could require moving historical data, integrating with it, or retrieving it from long-term archive storage. This has implications for traffic management, security, data handling, and storage.
Financial institutions may be reluctant to move sensitive data off-premises. Many organisations cannot afford to build and tear down data centres to handle their processing and storage scale demands, nor do they have the agility needed to deal with rapidly changing, high-volume unstructured datasets. Cloud computing is fast becoming a keystone in our thinking about the way we architect data centres, and one can foresee institutions deploying hybrid cloud solutions at both a strategic and tactical level to handle big data tasks, perhaps anonymising data or covering regulatory concerns through service level and data confidentiality agreements.
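As a rough sketch of that anonymisation step, the example below pseudonymises direct identifiers with a keyed HMAC before records leave the premises, so the cloud tier sees consistent tokens rather than raw personal data. The field names and key handling are hypothetical, and real deployments would follow their regulator’s guidance.

```python
# Illustrative sketch: pseudonymise records before they leave the
# premises. Identifiers are replaced with keyed HMAC tokens that stay
# consistent across records (so analytics still works) while the raw
# values and the key never reach the cloud. Field names are hypothetical.
import hmac
import hashlib

SECRET_KEY = b"on-premise-only-key"   # never shipped to the cloud

def pseudonymise(value: str) -> str:
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

def prepare_for_cloud(record: dict) -> dict:
    out = dict(record)
    for field in ("account_id", "customer_name"):
        out[field] = pseudonymise(out[field])
    return out

record = {"account_id": "GB29NWBK60161331926819",
          "customer_name": "A. Customer",
          "amount": 42.50, "merchant_category": "grocery"}
print(prepare_for_cloud(record))
```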
Internet of Things (IoT)
We can’t mention big data without also mentioning the Internet of Things (IoT). Various industry estimates put the number of internet-connected devices at between 50 and 75 billion by 2020. This is going to radically change how humans interact with technology, the visibility we have into the state of these ‘things’, and the insights gained from analytics on those ‘things’.
In practice, this will result in the generation of much higher volumes of unstructured data (through instrumentation, external feeds, etc.). All of this data will need to be stored in enterprise data centres and analysed using big data solutions, something that needs to be considered and factored into future IT planning.
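A back-of-envelope calculation shows why this belongs in capacity planning; every figure below is an assumption for illustration, not a forecast.

```python
# Back-of-envelope estimate of global IoT telemetry volume.
# All inputs are assumptions chosen purely for illustration.
devices = 50e9              # low end of the 2020 estimates cited above
messages_per_day = 1440     # assume one reading per minute per device
bytes_per_message = 200     # assume a small telemetry payload

daily_bytes = devices * messages_per_day * bytes_per_message
print(f"~{daily_bytes / 1e15:.0f} PB of raw telemetry per day")
# Even a tiny enterprise-sized slice of this stream can dwarf an
# organisation's current warehouse intake.
```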
Security
Big data is relatively new; it has only been around a decade since Google published the seminal MapReduce paper, and as with any new technology the primary concern so far has been functionality. This introduces a number of security challenges, not only in the secure handling and storage of the data, but in understanding the nature of the data itself and how it can be manipulated to create insight (and potentially breach confidentiality policies).
At the most basic level, big data components may include only rudimentary access control and integration with systems such as Kerberos and, depending on the components you choose, may introduce additional vulnerabilities when mapped against a mature security framework. It’s also important to determine how long to keep this data and how to ensure that its integrity is maintained, potentially over many years. With big data there may simply be a lot more data, but its scope may also be much broader, and it is likely to be more granular as the drive to instrument everything continues. These remain important concerns, especially in the heavily regulated financial services industry. Fortunately, considerable effort is going into mitigating these challenges, both through conventional security techniques and through emerging technologies such as blockchain.
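To give a flavour of how blockchain-style techniques address long-term integrity, the sketch below chains SHA-256 digests over archived batches so that any later tampering with a historical record breaks every subsequent link. This is a bare-bones illustration of the principle, not a production design.

```python
# Minimal hash-chain sketch of the integrity idea behind blockchain-
# style approaches: each archived batch commits to the digest of the
# previous one, so modifying any record invalidates the whole tail.
import hashlib
import json

def chain_digest(prev_digest: str, batch: list) -> str:
    payload = prev_digest + json.dumps(batch, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

batches = [["txn-001", "txn-002"], ["txn-003"], ["txn-004", "txn-005"]]

# Build the chain as batches are archived.
digests, prev = [], "genesis"
for batch in batches:
    prev = chain_digest(prev, batch)
    digests.append(prev)

# Years later: re-derive the chain from the archive and compare
# digests to detect any modification of historical records.
intact, prev = True, "genesis"
for batch, expected in zip(batches, digests):
    prev = chain_digest(prev, batch)
    intact &= (prev == expected)
print("integrity intact:", intact)
```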
Seizing the Opportunity
Financial service organisations have no physical product to sell; data and the associated workflows are business-critical assets. Big data offers the promise of real differentiation for early adopters, especially where competitive advantage can be opportunistic and short-lived. The most effective strategy for big data adoption will be to identify core business requirements and then leverage existing infrastructure as part of a phased migration, ideally taking a specific project as a proof of concept in order to build up the necessary data science skills, assess deployment, storage and archiving models, and address regulatory and security concerns.
