‘Big data’ system to aid banks

Why in the news?
The government is testing a new system that will assist banks in assessing credit risk and the probability of fraud using big data analysis.
More on news
  • Currently, rural and cooperative banks rely on the judgement of the bank manager, which has resulted in high non-performing assets (NPAs).
  • The system is expected to help lenders, particularly rural and cooperative banks, tackle the issue of rising NPAs.
  • According to the Ministry of Electronics and IT, a credit rating model has been developed “that can assist the rural and cooperative banks to quantify risks under the big data context”.
  • The ministry-sponsored project includes as partners the Reserve Bank of India (RBI), the Bangalore-based IT firm Processware System and two cooperative banks.
  • The project is aimed at helping banks quantify risks associated with retail loans such as gold loans, personal loans and vehicle loans. 
  • Under the project, a statistical and machine-learning model has been developed to predict the probability of default, with the aim of reducing NPAs (an illustrative sketch follows this list).
  • A model has also been developed for predicting different types of fraud in the banking sector, based on RBI guidelines.
  • Web-enabled software is also being tested; once implemented, it will help banks easily adopt the models for credit rating, non-performing assets and fraud.
  • Validation of the models has been done using data from several banks.
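The article does not describe the project's model internally. Purely as a hedged illustration, the sketch below shows how a probability-of-default score for retail loans might be produced with a simple logistic-regression classifier; the feature names, synthetic data and tooling (Python with scikit-learn) are assumptions for demonstration, not details of the MeitY project.

    # Illustrative sketch only: a simple probability-of-default (PD) model for
    # retail loans. Features and data are hypothetical, not from the project.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 5000

    # Hypothetical borrower features: loan amount and annual income (both in
    # lakh INR) and the number of past delayed repayments.
    loan_amount = rng.uniform(0.5, 10.0, n)
    income = rng.uniform(1.0, 20.0, n)
    past_delays = rng.poisson(1.0, n)

    # Synthetic ground truth: default becomes likelier as the loan-to-income
    # ratio and the count of past delays rise.
    risk = 1.2 * (loan_amount / income) + 0.8 * past_delays - 2.0
    default = (rng.random(n) < 1 / (1 + np.exp(-risk))).astype(int)

    X = np.column_stack([loan_amount, income, past_delays])
    X_train, X_test, y_train, y_test = train_test_split(X, default, random_state=0)

    # Fit the classifier and read out predicted default probabilities, which a
    # lender could map to credit-rating grades or approval thresholds.
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    pd_scores = model.predict_proba(X_test)[:, 1]
    print("Mean predicted probability of default:", round(pd_scores.mean(), 3))

In practice, such a model would be trained on a bank's historical loan book rather than synthetic data, and its output would be calibrated and validated before being used for credit decisions.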
What is Big data?
  • Big data is a term that describes the large volume of data – both structured and unstructured – that inundates a business on a day-to-day basis.
  • But it’s not the amount of data that’s important. It’s what organizations do with the data that matters.
  • Big data can be analyzed for insights that lead to better decisions and strategic business moves.
  • Big data is often characterized by the 3Vs: the extreme volume of data, the wide variety of data types and the velocity at which the data must be processed.
  • Volume: Organizations collect data from a variety of sources, including business transactions, social media and information from sensor or machine-to-machine data. In the past, storing it would have been a problem – but new technologies (such as Hadoop) have eased the burden.
  • Velocity: Data streams in at an unprecedented speed and must be dealt with in a timely manner. RFID tags, sensors and smart metering are driving the need to deal with torrents of data in near-real time.
  • Variety: Data comes in all types of formats – from structured, numeric data in traditional databases to unstructured text documents, email, video, audio, stock ticker data and financial transactions (a toy illustration follows this list).
  • Although big data doesn't equate to any specific volume of data, the term is often used to describe terabytes, petabytes and even exabytes of data captured over time.
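As a toy illustration of the "variety" point above, the sketch below shows one small pipeline receiving records in different shapes (structured JSON and free text) and labelling each one; every record here is invented for demonstration, and the use of Python is simply an assumption.

    # Toy illustration of "variety": one pipeline ingesting records that arrive
    # in different shapes. All data below is made up for demonstration.
    import json

    incoming = [
        json.dumps({"type": "transaction", "amount": 2500, "currency": "INR"}),
        "Customer email: please reverse the duplicate charge on my account",
        json.dumps({"type": "sensor", "device": "ATM-17", "temp_c": 41.5}),
    ]

    def classify(record: str) -> str:
        """Label a record as structured (parseable JSON) or unstructured."""
        try:
            json.loads(record)
            return "structured"
        except json.JSONDecodeError:
            return "unstructured"

    for record in incoming:
        print(classify(record), "->", record[:50])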
Source
The Hindu.



Posted by Jawwad Kazi on 25th Jun 2018