Monday, 21 September 2015

IoT in ATM Automation

By Sudheer P N

Growing demand from customers as well as stiff competition are forcing banks to install new ATMs across wide areas and spread their reach. Maintaining these ATMs and monitoring their operation 24x7 is a big challenge for banks, especially across remote regions. The larger the ATM network, the bigger the challenge of handling it. Of course, many banks outsource this activity to other companies, but this incurs significant cost, and it is still very difficult to get the real-time status of the ATMs.

The Internet of Things (IoT) can play a vital role in monitoring and controlling ATMs remotely, and banks should leverage the technology for this purpose.

We shall see how ATMs can be automated and controlled remotely.

ATM Management

Typically, the secure room housing an ATM contains many devices, such as:
  • ATM
  • Battery for power backup
  • Lighting and cooling devices – light bulbs, AC
  • Security devices – camera, alarm, door lock etc.
  • Kiosks
  • Phones etc.
It is very important to obtain the status of these devices and control them from a remote location. This ensures optimal use of the devices and thereby saves cost.

How to Control the Devices?

Various sensors listed below can be placed in the ATM enclosures to monitor the health of the ATM room:
  • Temperature sensor
  • Battery health checkup sensor
  • Light sensor
  • Motion sensor
  • Smoke sensor
  • Pressure sensors
  • ATMs of today also have built-in mechanisms to determine:
    • Less cash reserves
    • Card reader issues
    • Stuck receipts
    • Printer issues
    • Printer paper roller information, etc.
With IoT, the regular stream of information from these sensors can be collected and analyzed to take actions on the devices.

Example usage (a small sketch follows this list):
  • If the temperature inside the ATM room goes below the specified limit, the AC can be switched off, thus saving power.
  • If the ATM reports low cash reserves, an alert can be sent to the manager in charge to ensure that enough cash is placed in the ATM. This greatly improves customer service, as it is frustrating for customers to walk away from an ATM without cash.
  • If the pressure sensor reports that someone is trying to break open the ATM, an alert message can be sent to the nearest police station and the manager in charge, and the alarm can be switched on. The camera captures a video snippet / picture of the intruder, which can be sent to the concerned person for further scrutiny.
  • Batteries can be serviced based on the reports from the battery sensor. This helps in keeping the ATM operational 24/7.
  • Information from the smoke detector can automatically switch off the power supply and send alerts to the concerned person.
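
To make these rules concrete, here is a minimal rule-engine sketch in Python; the sensor names, thresholds and the notify/switch actions are illustrative assumptions, not part of any particular ATM product.

    # Hypothetical rule engine for ATM-room sensor readings (illustrative only).
    def handle_reading(reading, actions):
        """Apply simple threshold rules to one sensor reading."""
        sensor, value = reading["sensor"], reading["value"]
        if sensor == "temperature" and value < 18.0:    # assumed limit in deg C
            actions["switch_ac"](on=False)              # too cold: AC off, save power
        elif sensor == "cash_level" and value < 10000:  # assumed minimum reserve
            actions["alert"]("manager", "Low cash reserve at ATM")
        elif sensor == "pressure" and value > 0:        # pressure event = break-in attempt
            actions["alert"]("police", "Possible break-in at ATM")
            actions["alert"]("manager", "Possible break-in at ATM")
            actions["switch_alarm"](on=True)
        elif sensor == "smoke" and value > 0:
            actions["switch_power"](on=False)
            actions["alert"]("manager", "Smoke detected in ATM room")

    if __name__ == "__main__":
        # Stub actions that just print, so the sketch runs stand-alone.
        actions = {
            "switch_ac": lambda on: print("AC on" if on else "AC off"),
            "switch_alarm": lambda on: print("Alarm on" if on else "Alarm off"),
            "switch_power": lambda on: print("Power on" if on else "Power off"),
            "alert": lambda who, msg: print("ALERT to", who + ":", msg),
        }
        handle_reading({"sensor": "temperature", "value": 16.0}, actions)
        handle_reading({"sensor": "cash_level", "value": 5000}, actions)
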
How Does ATM Automation Work?

  • The ATM automation application can be developed as a browser app or an Android app.
  • Various sensors send data at regular intervals to the nearest gateway.
  • The gateway present in the ATM room relays all the sensor data to the remote IoT server.
  • The ATM app receives all the required sensor information and displays it in the desired format to the user.
  • Actions to be taken are sent as commands to the required devices.
  • Data is filtered at the gateway level, based on the configuration information set (a small sketch of this step follows the list).
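
A minimal sketch of the gateway-side filtering and relay step follows; the IoT server URL, the reporting interval and the filter thresholds are assumptions made purely for illustration.

    # Hypothetical gateway step: filter sensor readings locally, relay the rest
    # to a remote IoT server. Endpoint and thresholds are illustrative assumptions.
    import json, urllib.request

    IOT_SERVER = "https://iot.example.com/api/atm/telemetry"   # placeholder URL
    FILTERS = {"temperature": 0.5}    # ignore changes smaller than 0.5 deg C
    last_sent = {}

    def should_relay(sensor, value):
        """Drop readings that changed less than the configured threshold."""
        previous = last_sent.get(sensor)
        return previous is None or abs(value - previous) >= FILTERS.get(sensor, 0)

    def relay(reading):
        data = json.dumps(reading).encode("utf-8")
        req = urllib.request.Request(IOT_SERVER, data=data,
                                     headers={"Content-Type": "application/json"})
        urllib.request.urlopen(req)
        last_sent[reading["sensor"]] = reading["value"]

    def read_sensors():
        """Stand-in for real sensor drivers."""
        return [{"sensor": "temperature", "value": 21.7},
                {"sensor": "cash_level", "value": 254000}]

    for reading in read_sensors():
        if should_relay(reading["sensor"], reading["value"]):
            relay(reading)
    # In production this pass would repeat at the assumed reporting interval (e.g. 60 s).
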
Advantages of Using IoT in Controlling ATMs
  • Increased customer satisfaction
  • Reduced ATM downtime
  • Cost savings in the long run
  • Admins receive real-time information about the health of the ATMs at their fingertips
As the demand for more ATMs increases based on customers' needs, banks depend on innovative technologies that help them reduce cost, retain customers and serve them better. Utilizing IoT for ATM management in today's scenario leads to effective cash management.

Wednesday, 26 August 2015

Virtual Tenant Network


By Harpreet Singh Dhillon

Virtual Tenant Network (VTN) is an application that provides a multi-tenant network facility on existing network infrastructure. Traditional network infrastructure is configured as a silo for each department, which requires heavy CAPEX and OPEX investment. Every department in an organization gets separate proprietary hardware, which cannot be shared with others.

VTN provides a logical abstraction plane, which facilitates complete segregation of the logical plane from the underlying physical infrastructure. VTN divides the network into multiple logical parts and maps them onto the physical infrastructure. This not only reduces complexity, but also provides better management of resources and brings efficiency to the infrastructure. NEC Corporation of America is contributing to the VTN project in OpenDaylight.

Architectural Overview

The VTN application is divided into the following two components (a small sketch follows):
  • VTN Manager: The plugin that interacts with other networking components to build the network for the end user. 
  • VTN Coordinator: An external application that provides a REST API to the end user for building VTN networks. The VTN Coordinator passes requests to the VTN Manager, which serves end-user requests and builds the desired network. 
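
As an illustration of the coordinator's REST interface, here is a small Python sketch that creates a tenant network; the endpoint path and JSON body follow the general shape of the OpenDaylight VTN Coordinator API, but the host, port, credentials and payload should be treated as assumptions and verified against the project documentation.

    # Hedged sketch: creating a virtual tenant network via the VTN Coordinator
    # REST API. Host, port, credentials and payload shape are assumptions here.
    import base64, json, urllib.request

    COORDINATOR = "http://127.0.0.1:8083"                    # assumed coordinator address
    AUTH = base64.b64encode(b"admin:adminpass").decode()     # assumed credentials

    def create_vtn(name, description=""):
        body = json.dumps({"vtn": {"vtn_name": name, "description": description}})
        req = urllib.request.Request(
            COORDINATOR + "/vtn-webapi/vtns.json",
            data=body.encode("utf-8"),
            headers={"Content-Type": "application/json",
                     "Authorization": "Basic " + AUTH},
            method="POST")
        with urllib.request.urlopen(req) as resp:
            return resp.status

    # One logical tenant network per department, on shared physical infrastructure.
    print(create_vtn("finance_vtn", "Tenant network for the finance department"))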


Fig. - Architecture of VTN

Challenges in Current Infrastructure
  • Lack of flexibility and agility: The current network infrastructure does not provide any flexibility across network devices. Due to this limitation, network devices do not support multi-tenancy and the network appliances run in silos for each department. 
  • Complexity: The network is becoming more complex due to the large and increasing variety of proprietary hardware appliances provided by various network vendors. 
  • Manually intensive management: Provisioning and configuration of network appliances are complex, manually intensive and time-consuming tasks. 
  • Huge CAPEX & OPEX investment: Each department requires a separate proprietary hardware appliance, which means a huge investment in purchasing new hardware and in the resources to manage the infrastructure.

Benefits of VTN
  • Reduce Capex: VTN provides a multi-tenancy feature on the network infrastructure, so the same physical infrastructure can be used by multiple departments. This reduces the need for separate hardware for each department and certainly decreases Capex investments. 
  • Reduce Opex: VTN provides centralized management of the network infrastructure through Software Defined Networking and eliminates manual effort, thus enabling automation in the network. This reduces manpower costs and saves huge Opex costs for an organization. 
  • Flexibility: The VTN facilitates easy, rapid and dynamic provision for new services in various locations. 
  • API support: The VTN supports REST API, which helps the network to integrate with the infrastructure orchestrator layer and automate the entire network provisioning and configuration management. 
Conclusion

Present-day network technology is evolving rapidly. With the evolution of cloud infrastructure, organizations are facing pressure to cut down their OPEX and CAPEX costs. VTN is well suited to this need: it not only reduces infrastructure costs, but also eliminates complexity in the existing infrastructure.

The Next Generation File System – ReFS

By Chetan Kumar

Resilient File System (ReFS) is the new file system for the next generation of Windows operating systems. It was introduced with the Windows Server 2012 operating system and is designed to overcome the shortcomings of NTFS (New Technology File System), the most widely used Windows file system so far. Future versions of Microsoft operating systems and applications (Exchange, SQL Server, etc.) will add support for ReFS.

Why Use ReFS?

The large customer base that relies on Windows for running business applications wants a cost-effective and reliable platform that provides data availability and data integrity. As the amount of data required for operating businesses increases, there is a need to scale efficiently across different workloads. ReFS is better than NTFS in many ways; the most important advantage of the new file system is resiliency. ReFS is built by reusing code from the NTFS engine, to maintain a high degree of compatibility with the NTFS features carried forward.

Key Features
  • Integrity: ReFS helps ensure detection of all forms of disk corruption (see the sketch after this list). 
  • Availability: ReFS gives priority to the availability of data. If corruption occurs, the repair process focuses on the corrupt area and does not require the disk to be taken offline; all repairs are performed online.
  • Scalability: As data volumes have grown enormously in today's world, ReFS is designed to work efficiently with large data sets without compromising on performance.
  • Proactive Error Correction: ReFS proactively works in the background by running an integrity scanner periodically and initiating repair of corrupt data.
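
The integrity feature rests on checksumming: metadata (and optionally data) is stored together with a checksum, and every read re-computes and compares it. The Python sketch below illustrates that general idea only; it is a conceptual stand-in, not ReFS code.

    # Conceptual sketch of checksum-based corruption detection, the idea behind
    # ReFS integrity streams. Illustrative only, not how ReFS is implemented.
    import hashlib

    def write_block(data):
        """Store a block together with its checksum."""
        return {"data": data, "checksum": hashlib.sha256(data).hexdigest()}

    def read_block(block):
        """Re-compute the checksum on read and flag corruption."""
        if hashlib.sha256(block["data"]).hexdigest() != block["checksum"]:
            raise IOError("Corruption detected: repair from a good copy")
        return block["data"]

    block = write_block(b"customer ledger page 42")
    block["data"] = b"customer ledger page 4X"    # simulate bit rot on disk
    try:
        read_block(block)
    except IOError as err:
        print(err)
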
Deployment Use Case

ReFS helps customers store data reliably irrespective of the reliability of the underlying hardware and software stack. This minimizes storage cost and reduces capital expenditure for businesses. Customers can deploy a Windows-based file server attached to inexpensive storage such as JBOD (Just a Bunch of Disks). Deployments can further include failover clustering, which uses a shared JBOD storage configuration.

Conclusion

ReFS is the new file system, which is believed to replace NTFS in the upcoming releases of the Windows Operating Systems and Application Software. It brings major improvements in terms of data integrity and reliability, which ensures that corrupt data is easily found and repaired. Hence, ReFS can be a widely used file system in the future that could reduce the total cost of ownership on Windows Servers.

Global Insurance Industry – A Paradigm Shift

By Ashish Mishra

Why call it a paradigm shift? In a world where being digital is no longer a new thing, it no longer thrills our industries either. Be it digital health insurance, digital health support for insurance or telematics, nothing is new or exciting now. We are in the ‘post digital’ era.

Then, where exactly is the paradigm shift? And, how different is it or would it be from the current techno-crunch?

I would share a different view - a shift where the real motive behind digital health is to ease health services. It can also mean Usage-Based Insurance or Pay As You Drive; telematics is one example of this.

For this discussion, a highly customized insurance plan is an ideal example of segmentation: an individual’s plan would focus only on Mr. X or on a very small group of people, although this has not been considered a feasible solution so far.

However, “Data and Analytics”, which is yet to be completely explored, can really help this industry reach a deeper level of market penetration. We have started seeing a proliferation of big data and of various analytical techniques for extracting different meanings from it.

On the same point, enhanced end-user experience and satisfaction can be sought by simplifying how the organization functions. Its business model will be key to recapturing the market. If the investment of the accumulated money is turning red, profit maximization can still be achieved through operational efficiency. Implementing data analytics would also give a competitive advantage, along with increased profitability.

But what else would companies acquire? Certainly loyalty from their customers, cost savings and the ability to scale their businesses. At the same time, companies will face bigger challenges; for instance, many companies will operate with a similar modus operandi, and customers or end users will tend to accept all the advertisements.

Thanks to the Sleeper Effect, the source or credibility of advertisement messages slowly fades away and, eventually, a user is left only with the persuasive message.

Coming back from the tracks of “Let’s reap all the benefits” (precisely companies expect so) to “Let’s all reap the benefit”, will certainly turn the tables.

Data security can also be a primary policy driver. Companies can upgrade to core, seamless and robust systems and implement the latest insurance business model that focuses more on data security. This model could be the next key winning factor for insurance companies, enabling them to realize greater responsibility towards their customers. An insurance company that can promise data security and also deliver 100 percent results will always stand out as a winner. The more an insurance firm anticipates challenges and handles data security diligently through technological readiness, the lesser the failure and its impact. These are the major factors such companies need to focus upon.

In my opinion, the design of any innovative technology is everything! Designing security, data centers, big data and its architecture is going to be the focal point in today’s world. And, from “Big is beautiful” to “Small is beautiful”, and eventually just “Being beautiful”, will apparently drive the entire ‘post digital’ era.

Thursday, 20 August 2015

Internet of Things (IoT) for Developing E-Labs

By Swathi K

With the enormous growth of the internet and its usage, the demand for more technologies and applications in every field is ever growing. Users are no longer comfortable doing their chores in person; they would rather obtain everything at their fingertips through various technologies.

Laboratory experiments are an integral part of science and engineering education. Automation is changing the nature of these laboratories, and the focus of the system designer is on the availability of various interfacing tools to access the laboratory hardware remotely, integrated with a computer-supported learning environment. In engineering, laboratories have had a central role in the education of engineers. The first kind of distance education comprised graduate programs intended primarily, if not solely, for part-time students who were employed full time. Since most graduate programs do not include a laboratory component, the question of how to deliver laboratory experiences did not arise. As undergraduate distance learning programs started to develop, this problem demanded a solution, and IoT (Internet of Things) is that solution.

A Remote Laboratory is a workbench that enables us to conduct real experiments remotely. With the growth of the Internet and IoT, Machine-to-Machine communication has increased the demand for new technologies to connect billions of devices worldwide.

An E-Lab gives users the ease of operating hardware even when they are physically present at a different geographical location, connected through their regular mobile devices. An E-Lab is built on wireless sensor technology that enables the user to exchange data and control the machines. This can be achieved with a simple combination of mobile services and applications.

This approach presents novel design techniques for a hardware system that implements a remote/wireless electronics lab. The approach uses GSM to provide students with remote access to physical laboratory apparatus. An application (app) is created with the help of Java to handle the hardware. Systems of this type are synchronous, giving students a sense of actual involvement in the experiment. A PC with a .NET program is used to interface the webcam. The PC is also interfaced with microcontrollers for controlling different units, and uses an Internet service to e-mail video clippings of the hardware setup and its working to the students’ e-mail IDs. This also provides safety measures for all humans involved in the learning process. Since GSM mobile phones are now widely used, this is the easiest way to access any remote laboratory.
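
A rough sketch of the PC-side flow described above follows. The original design uses Java and .NET; Python is used here only to keep all sketches in one language, and the serial port, SMTP server, addresses and file names are illustrative assumptions.

    # Hypothetical PC-side flow for a remote e-lab: forward a student's command
    # to the microcontroller over a serial link, then e-mail a recording of the run.
    import smtplib
    import serial                      # pyserial, assumed to be installed
    from email.message import EmailMessage

    def run_experiment(command, port="COM3"):
        """Send a control command to the lab hardware via the microcontroller."""
        with serial.Serial(port, baudrate=9600, timeout=2) as link:
            link.write(command.encode("ascii") + b"\n")

    def mail_clip(student_email, clip_path):
        """E-mail the webcam recording of the hardware setup to the student."""
        msg = EmailMessage()
        msg["Subject"] = "Your remote lab experiment recording"
        msg["From"] = "elab@example.edu"
        msg["To"] = student_email
        with open(clip_path, "rb") as clip:
            msg.add_attachment(clip.read(), maintype="video",
                               subtype="mp4", filename="experiment.mp4")
        with smtplib.SMTP("smtp.example.edu") as server:
            server.send_message(msg)

    run_experiment("SET_VOLTAGE 5")                    # hypothetical command string
    mail_clip("student@example.edu", "experiment.mp4")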

Advantages
  • The user can login and carry out experiments from any geographical location.
  • The user will have no time constraints. 
  • A remote lab provides extended access to expensive and highly specialized devices or setups.
  • Economic usage - sharing labs reduces utilization and maintenance costs.
  • Provides security through user authentication and has no risk of catastrophic failure.
  • Provides safety measures for all humans involved in the learning process. Any damage during the conduct of an experiment will not harm anyone in person. Example: advanced high-voltage or chemical laboratories. 
  • Increased efficiency - improved communication leads to faster transfer of information.
Future Scope

Live watch - to enhance distance learning, laboratories can provide live streaming or video clippings using an internet service.

Microsoft Exchange 2016: The vNext

By Chetan Kumar

The upcoming version of Exchange Server is built on the architecture of Exchange Server 2013. It has been further refined to suit deployments of all scales. The new version of the product evolved from Office 365 and enables both on-premise and hybrid deployments. The official release of the product will be in the 3rd quarter of 2015.

Architecture

Exchange Server 2016 eliminates server roles completely and hence is a ‘single role’ product. Microsoft pushed the product to a new version to simplify the product architecture and improve its capabilities. Refer to Figure 1 for the architecture changes in the new product.


Figure 1- Architecture Design

The mailbox server role hosts all the components to process, render and store data, and contains the logic to route requests to the correct target endpoint. Even with the elimination of the CAS role, communication between the servers still occurs at the protocol level.

Improvements
  • Search Improvements: Search is improved for Outlook online-mode clients. Network bandwidth requirements between active and passive database copies are reduced. 
  • Document Collaboration: Integration with Office Web Apps Server adds the functionality of editing documents in Outlook Web Access. 
  • Extensibility: REST APIs are now available in Exchange Server 2016, allowing developers to connect from any platform and simplifying programming for Exchange (a small sketch follows this list).
  • Outlook Connectivity: MAPI/HTTP is the default protocol enabled for users in Exchange Server 2016. 
  • Coexistence with Exchange Server 2013: It is comparatively easier to move from, or coexist with, your existing Exchange Server 2013 deployment. 
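
To illustrate the extensibility point, here is a small sketch of a REST call against the Outlook/Exchange endpoint. The URL follows the general shape of the Outlook REST API v2.0, but the exact endpoint and the token-acquisition flow are assumptions that should be checked against current Microsoft documentation.

    # Hedged sketch: reading the top messages of a mailbox through the Exchange /
    # Outlook REST API. A real client must first obtain an OAuth bearer token
    # from Azure AD; the endpoint shape is an assumption for illustration.
    import json, urllib.request

    def top_messages(access_token, count=5):
        url = "https://outlook.office365.com/api/v2.0/me/messages?$top=%d" % count
        req = urllib.request.Request(url, headers={
            "Authorization": "Bearer " + access_token,
            "Accept": "application/json",
        })
        with urllib.request.urlopen(req) as resp:
            payload = json.load(resp)
        return [m.get("Subject") for m in payload.get("value", [])]

    token = "<oauth-token-obtained-from-azure-ad>"     # placeholder, not a real token
    print(top_messages(token))
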
Key Benefits to Customers
  • Simpler Deployments: With the Exchange Server 2016 architecture, Exchange Servers that are identical with respect to hardware, configuration and so on make deployment simpler. 
  • Reduced Infrastructure/Software cost: The new architecture of Exchange Server 2016 reduces the number of physical Exchange Servers compared to previous versions, which lowers the investment in infrastructure, operations and software licensing. 
  • Faster Recovery: The new version provides faster failover/recovery of databases (around 33% faster), disk IOPS are further reduced, and automated database repair detects and fixes divergent database copies. 
Conclusion

Exchange Server 2016 is built on a proven architecture and a flexible, future-ready foundation, continuing the vision of reducing architectural complexity. Customers running the previous version of the product can easily adopt the new version in their existing deployments. The new product version helps simplify the messaging environment and increase the availability and resiliency of the deployment.

Thursday, 13 August 2015

Affinity Management Services

By Harpreet Singh Dhillon

Network services facilitate communication between different infrastructure elements such as virtual machines, storage and the end user. The conversation between these elements is called ‘affinities’. In today’s world, applications change dynamically, but the network still runs as a silo without any intelligence. Protocols such as OpenFlow, NETCONF and SNMP, which are used in software-defined networks, are concerned only with network device performance and do not focus on application performance.

The affinity management service is an API in an SDN network that maps application needs to the infrastructure. Through this service, all aspects of an application, such as workloads, communication patterns and bandwidth needs, can be formally described to a centralized SDN controller. The SDN controller then determines what sort of path the application should take: for example, a latency-sensitive application is provisioned on the path with the lowest hop count, while a bandwidth-intensive application is given guaranteed bandwidth along its path. A small sketch of describing such affinities follows the architecture figure below.
                     
Architectural Concept of Affinity Management Services
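
To make the idea of describing affinities concrete, here is a hypothetical sketch of an affinity descriptor being posted to a controller's northbound API; the endpoint and JSON schema are invented for illustration and do not correspond to any specific controller release.

    # Hypothetical affinity descriptor sent to an SDN controller's northbound API.
    # The endpoint and schema are illustrative assumptions, not a real controller API.
    import json, urllib.request

    CONTROLLER = "http://controller.example.com:8181/affinity/v1/policies"  # placeholder

    def declare_affinity(app_name, workload_selector, needs):
        """Describe what an application needs from the network in service-level terms."""
        policy = {"application": app_name,
                  "workloads": workload_selector,   # e.g. {"tier": "db"}
                  "needs": needs}                   # latency- or bandwidth-oriented
        req = urllib.request.Request(CONTROLLER,
                                     data=json.dumps(policy).encode("utf-8"),
                                     headers={"Content-Type": "application/json"},
                                     method="POST")
        with urllib.request.urlopen(req) as resp:
            return resp.status

    # Latency-sensitive app: ask for the lowest-hop-count path.
    declare_affinity("trading-app", {"tier": "frontend"}, {"path": "lowest-hop-count"})
    # Bandwidth-intensive app: ask for a guaranteed-bandwidth path.
    declare_affinity("backup-app", {"tier": "storage"}, {"bandwidth_mbps": 500})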

Challenges in the Current Infrastructure
  • Lack of intelligence and flexibility: The current network infrastructure does not provide any flexibility or intelligence in network devices. SDN protocols such as OpenFlow, NETCONF and SNMP are concerned only with network device performance and do not support application performance.
  • Huge business loss: The current network infrastructure faces issues such as network traffic congestion and latency, which degrade the performance of business applications and result in huge business losses for organizations.
  • Manually intensive management: Today's traditional networks require manual effort to configure for application performance. This is a very time-consuming task and builds complexity in the network environment. 
Benefits of Affinity Management Services
  • Improved application performance: The SDN controller uses the affinity information and optimizes the network to improve application performance to a great extent. An affinity describes network resource needs in terms of service levels rather than per-device configuration.
  • Easy customization: The affinity management service provides a user interface directly to end users, who can easily customize the network to the application's demands without any intervention by the administrator.
  • Scalability: Network resources are scaled dynamically based on application requirements.
  • Automation: Affinity management services bring automation to the network, as the SDN controller gains access to workload communications and can easily automate the optimization of network resource usage.
Conclusion

Enterprises are moving towards highly dynamic infrastructure that maps to their business applications, generates a high return on investment and lowers investment costs. Affinity management services are an excellent solution for enterprises: they optimally utilize the entire infrastructure and deliver high network performance, mapping application needs to the infrastructure and provisioning network resources as per application requirements. The affinity management service refines the way network resources are deployed and dynamically changes the entire architecture of traditional datacenters.

Thursday, 6 August 2015

Driverless Cars and Insurance

By Harshith Ail

Technological innovation has been ruling the present day world. Individuals encounter a new technology in every walk of life. And just when we think innovation has reached its saturation point, we come across a new technology, which was never thought of before. Driverless cars are one such innovation.

Equipped with a self-driving mechanism, driverless cars have become one of the most awaited innovations in the present-day world. These cars are currently under road testing, and it will take around a decade before they are available to the general public. However, it is already being forecast that this innovation could be a game changer for many industries.

This new technology could bring with it challenges for insurance companies and regulatory bodies in the countries where it is put into practice. The California Department of Motor Vehicles has raised a concern that "The technology is ahead of the law in many areas" and states that, as per the law, a human being should operate the vehicle. There is hence an argument among policy makers and regulators that a new law should be brought in for such a breakthrough technology.

Impact on Insurance Companies

Driverless cars could also have an adverse effect on insurance companies, as these cars are known for their low collision rate. Fewer collisions mean fewer insurance claims. This will prompt insurance companies to lower their premiums, which will in turn result in a challenging insurance market and increased competition. The blend of driverless cars and Usage-Based Insurance (UBI) could change the way automobile insurance operates at present. Moreover, these cars could also affect the ROI of insurance companies, as these companies will now concentrate more on research.

Driverless cars work using the internet and inbuilt cameras. Hence, there is a possibility that insurance companies could tie up with these device manufacturers to gather data. Insurance companies with legacy applications could face more challenges. It may become necessary for them to upgrade their applications to support these devices. This need translates into an opportunity for the IT industry.

If these cars come into use, the companies that make the driverless systems, rather than the drivers, will have to bear the blame for accidents. On the other hand, the general public is under the impression that this technology can be an answer to driver errors such as drinking and driving, not following lane discipline, etc.

Conclusion

This new technology will make insurance companies adopt different approaches to designing their products. It is too early to decide whether it will be a boon or a bane for insurance companies, but it can certainly be said that automobile insurance will not be the same anymore.

Friday, 3 July 2015

New Generation Payments

By Saranya Haridass

As the payment industry grows across the globe, newer and convenient ways of making payments are the current trend, wherein we are moving towards cashless payments. The next generation of customers is looking at using their favorite things as payment devices.

In this direction, the emerging new payment technology is contactless payments, most commonly carried out through mobile payments. Contactless payments are also known as ‘wave and pay’, ‘touch and go’, ‘scan and go’ and ‘tap and go’. Payments can be made without physical contact between the payment device and the terminal, with the help of radio-frequency technology. These payments have a lower limit, i.e., the amount per transaction is capped. Payments of this kind are most effectively used in transportation services, parking, fast food restaurants and vending machines. Because of the lower limit and the security features, there has been a remarkable increase in the usage of contactless payments; fraudsters are less interested in contactless payments as the amount limit is low.

Contactless Payment Devices

There are quite a few options available for contactless payments such as Contactless cards, contactless wrist bands, contactless key fobs, contactless mobile, contactless stickers and many more.

Benefits
  • Faster - Considerable time is saved, speeding up the transaction process during peak times, as a PIN or signature is usually not required. As part of the security requirements, a PIN might sometimes be required to complete the transaction.
  • Customer friendly - These payments are more convenient for customers to use, as they do not have to enter a PIN, sign the bill or carry cash for low-value payments.
  • Highly secure - Contactless payments are more secure, as the devices have a unique built-in key that generates a unique code for each transaction (illustrated in the sketch after the 'How It Works' steps). Any attempt to pay using the same transaction information will be rejected. The payment device is safe as it stays with the customer during the entire transaction. Apart from the amount, no other details such as the customer name are communicated to the device during the transaction. 
  • Protection - Like other payments, contactless payments are covered by fraud protection regulations. Money will be refunded to the consumer's account in case of fraudulent transactions.
How It Works
  1. Both the payment device and the card reader should display the contactless symbol, indicating that a contactless payment can be executed
  2. The merchant enters the payment amount in the card reader, and the card reader prompts the customer to present their payment device
  3. The customer verifies the payment amount and places the contactless payment device within 4 inches of the contactless icon on the card reader 
  4. The terminal reads the data from the card
  5. A beep sound or green light indicates that the payment is being processed
  6. The customer receives the receipt for the approved payment
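
The "unique code for each transaction" mentioned above can be pictured with the toy sketch below: a per-transaction cryptogram derived from a device key and a counter, so that replaying the same transaction data is rejected. This is a simplified illustration, not the actual EMV contactless algorithm.

    # Toy illustration of a per-transaction cryptogram for contactless payments.
    # Real cards use EMV-specified algorithms; this only shows why replayed
    # transaction data is rejected.
    import hashlib, hmac

    DEVICE_KEY = b"secret-key-embedded-in-the-device"   # illustrative only
    seen = set()

    def cryptogram(amount_cents, counter):
        message = ("%d:%d" % (amount_cents, counter)).encode()
        return hmac.new(DEVICE_KEY, message, hashlib.sha256).hexdigest()[:16]

    def authorize(amount_cents, counter, code):
        expected = cryptogram(amount_cents, counter)
        if code != expected or code in seen:             # wrong or replayed code
            return False
        seen.add(code)
        return True

    code = cryptogram(450, counter=1)    # tap 1: pay 4.50
    print(authorize(450, 1, code))       # True  - accepted
    print(authorize(450, 1, code))       # False - replay rejected
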
Conclusion

In the coming years, contactless payment devices will substitute the plastic cards and other types of payment options. With the increase in security and customer confidence, the technology is slowly moving towards the cashless way of life.

Utility of next generation technology explored at its best!

Payment Tokenization

By Santosh Srinivasa

Background

The payment industry has been facing the challenge of providing payment solutions that protect against various types of fraud, such as counterfeiting, theft and account misuse. The implementation and use of EMV chips across the globe has provided protection for card-present transactions, while something similar is needed for card-not-present transactions and for new environments that combine elements of both card-present and card-not-present transactions. Payment tokenization technology promises to address this issue.

The industry has spent a lot of time working on this; Apple Pay, introduced in October 2014, used the tokenization approach provided by Visa, MC and AMEX as an answer to this. Visa is planning big with the NFC tokenized mobile payment approach, which will change the world of payments.

When we make a payment, we hand over the card to the merchant and trust that the card details are safe and secure. The card details from the merchant are sent to the acquirer through a switch and reach the issuer via the card schemes for authorization.

The emergence of mobile payments has created the risk of relying on the phone itself to carry out many transactions, which requires the device to be tracked and monitored for fraud prevention. The new solution converts all sensitive card-related information into a single-use token generated by a third-party system. This makes it difficult for hackers to access any data or use it for purchases.

Merchants are concerned about no longer having the PAN for purposes beyond the transaction itself, such as bonus/loyalty points and dispute claims; obscuring the PAN leaves merchants without data that is important for processing. The token service therefore ensures that only a portion of the PAN is masked, with the first 6 digits remaining available.

An Overview of Tokenization

Tokenization is a process in which the PAN (Primary Account Number) is replaced by a surrogate value called a ‘token’. The process is very secure, with properties that make it difficult to determine the original PAN from the token. The token is mapped to the PAN and used by other systems and applications within the environment.

Once implemented, the tokenization system limits the storage of cardholder data by the merchant, in line with PCI-DSS compliance. The token system generates random numbers unrelated to the PAN used to make payments. This is the technology driving the Apple Pay service, keeping the credit card details safe and secure on the mobile device without them ever being transmitted. Merchants cannot see the real PAN, and in case of a security breach the customer does not need to apply for a new card.
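
A minimal sketch of the token generation and mapping steps is shown below. It keeps the first 6 digits (as noted earlier) and replaces the rest with random digits stored in a vault; the format and the in-memory storage are simplified assumptions, not a PCI-certified design.

    # Simplified tokenization sketch: replace a PAN with a surrogate token and
    # keep the mapping only in a vault. A real token service follows the
    # EMVCo / PCI specifications; this is purely illustrative.
    import secrets

    class TokenVault:
        def __init__(self):
            self._token_to_pan = {}

        def tokenize(self, pan):
            """Return a token that preserves the first 6 digits of the PAN."""
            token = pan[:6] + "".join(secrets.choice("0123456789")
                                      for _ in range(len(pan) - 6))
            self._token_to_pan[token] = pan       # mapping held only in the vault
            return token

        def detokenize(self, token):
            """Only the token service can map the token back to the real PAN."""
            return self._token_to_pan[token]

    vault = TokenVault()
    token = vault.tokenize("4111111111111111")
    print("merchant sees:", token)                # surrogate value, safe to store
    print("issuer resolves:", vault.detokenize(token))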

Tokens and Data in Payment Cards

There are many mobile wallets available in the market like – Google, Venmo, Payfone, ISIS, CSAM and mPOS solutions, which can be considered for this service.

The tokenization solution will benefit acquirers, merchants, card issuers and card holders.

The main components in this are:
  • Token generation
  • Token mapping
  • Card data vault
  • Cryptographic key management
The ISO standards applicable for this are ISO 7812, ISO 8583, ISO 9564, ISO 1349, ISO 27001, PCI-DSS

Conclusion
The security of making mobile payments with Apple Pay and Android devices is key for this growing market. The token service will be the future, supporting the token application on the mobile device. Customers will be satisfied with payment applications providing this facility and will not even notice the token service working underneath.

Visa has already implemented the token service in the US and is looking forward to extending it worldwide, in competition with others offering NFC-HCE (Near Field Communication - Host Card Emulation) solutions for the same.

The future for online payments is mobile payments with token service.

Thursday, 11 June 2015

Essential Aspects to Secured and Protected Cloud Data Storage

By Thejaswini J

A cloud data storage system is a network of vast distributed data centers that implement cloud computing technologies such as virtualization, and provide a user-centric interface for storing all types of data.

Cloud computing has become the topmost priority of several technology-oriented organizations, and thus, data storage has easily made its way into the virtual space. Data security and privacy are more important in the virtual space at all the levels, be it IaaS (Infrastructure as a Service), SaaS (Software as a Service), or PaaS (Platform as a Service).

Let’s evaluate the scenarios of data security and privacy and make well-informed judgments about the risks involved for users and organizations. When it comes to data, not all data is of equal importance. For instance, if we consider the use of a private cloud versus a public cloud, we can clearly distinguish between sensitive data and non-sensitive data.

We cannot deny the flexible access and storage facility that cloud computing offers! Yet, a substantial gap remains between the perspectives of vendors and users about the transparency, privacy and security of cloud storage. Although the cloud industry maintains that clouds are much more secure than what we currently use for data storage, issues pertaining to privacy, security and availability are the foremost concerns of any organization or user when it comes to cloud adoption.

Security and Privacy

Data privacy, data management and data security are the main concerns of every user as well as of cloud service providers, as more and more companies seek to understand the security aspects before availing themselves of a cloud service. The biggest threat of losing data comes from hackers who can break into online servers and slip away with the data; because it was on the cloud, there may be no way of retrieving it.

Cloud security depends on various factors as represented below:


Storing data on the cloud entails management of virtual data, physical data and the machines, all of which is entirely delegated to cloud service providers. Users like you, on the other hand, retain limited control over your data and virtual machines.

Laws and Legal Obligations

Depending on the country, there exist laws and legal provisions that every cloud service provider needs to adhere to while processing and storing your data, although the enforcement of these laws may differ from one country to another. Three main parties are involved in storing, processing and retrieving data: cloud storage service providers, subcontractors who offer infrastructure / resources to cloud providers, and you, the cloud user who wishes to store data ‘safely’ on the cloud. Each of these parties must follow the legal requirements that concern personal rights such as data access or security rights. Overall, everyone engaged in accessing, providing and availing cloud services must understand that abiding by the law of the land is a must. Hence, as a cloud user, you should consider the legalities involved, as you are primarily accountable for your data and its processing.

In Conclusion…

Even as cloud computing brings several advantages, data security and privacy protection remain major concerns and need to be addressed, as data is prone to threats especially while sharing or retrieving information. Healthcare, bills & payments, online banking and e-commerce systems require the most security and protection, as they store information about credit / debit cards, transactions and health data. The ability of today’s technology to protect and control what information to reveal and who can gain access to it is a growing concern in this huge web world. So, cloud users, stay vigilant!

An Overview of Crowd Funding

By Mahesh D N

April 14th of 2013 was a monumental day in the lives of Veronica Mars’s fans. More importantly, the day is now remembered for when Crowdfunding truly became a global success story.

Crowdfunding is defined as the practice of pooling money from public sources on an online platform instead of the traditional method of loans provided by banks or financial institutions.

Business Model: A successful phenomenon hinges on having a good business model in place with different stakeholders taking part, and crowdfunding is no different. First and foremost, we have funders, who are members of the general public using their own means to donate money towards a project or cause. On the other hand, fundraisers are generally entrepreneurs or individuals inviting the donation of money towards their dream project. Lastly, we have platform providers, which are online websites that help fundraisers realize their dream project. DonorsChoose, Kickstarter, Sciflies, Indiegogo, USeed, Prosper and RaisingSocial are some of the popular platform providers.

Stakeholder Analysis: A platform provider can be perceived as a website that gives an entrepreneur the freedom to express himself explicitly and attract attention to the project. The crowdfunding provider also takes upon itself the freedom to use various marketing maneuvers to bring in donations for the project. Entrepreneurs research those platforms that have had varied levels of success and enough word-of-mouth marketing to be deemed popular. While posting their project idea, fundraisers also publish on the platform the target amount they are seeking and the estimated time for reaching that target. Funders are those who patronize the site on a regular basis and fund a project they find interesting by transferring their money online.

Profitability Analysis for Funders: Funders who fund a project are not just in it to give their money away; they expect returns and are enticed by different measures, the major one being gifts. Under the gifts model, the “Keep-it-All” variant gives the fundraiser the security of keeping all the money donated even if the target is not met, while the “All-or-Nothing” variant means the fundraiser has to return the entire amount donated if it falls short. The second variant is more popular, as it assures investors that only a quality project will be completed. Another reward is in the form of equity, whereby funders have the option of sharing the profits of the project while bearing no responsibility even if the project fails. Finally, we have ‘Peer-to-Peer’ lending, where funders earn interest on the money they lend to the project after credit terms are agreed upon.
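
The payout difference between the two gift models can be captured in a few lines; the 5 percent platform fee below is a made-up assumption used only to show the logic.

    # Payout logic for the two crowdfunding gift models described above.
    # The 5% platform fee is a made-up assumption for illustration.
    PLATFORM_FEE = 0.05

    def payout(raised, target, model):
        """Return what the fundraiser receives under each model."""
        if model == "all-or-nothing" and raised < target:
            return 0.0                       # everything goes back to the funders
        return raised * (1 - PLATFORM_FEE)   # keep-it-all, or target met

    print(payout(raised=8000, target=10000, model="keep-it-all"))     # 7600.0
    print(payout(raised=8000, target=10000, model="all-or-nothing"))  # 0.0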

Conclusion

Crowdfunding has caught on in the last decade and is very much a modern phenomenon. It gives entrepreneurs the freedom of no longer being held hostage to banks. At the same time, it provides a platform for the general public to bring to fruition a project they long for and believe in.

Thursday, 4 June 2015

Unified Data Management Drives Value in Healthcare

By Prabakar M

Data Management in Healthcare and Need for UDM

Many establishments today manage data in isolated silos through independent teams using numerous data management tools, to ensure data quality, data integration, data governance and master data management. Many organizations are now looking for better data management techniques such as unified data management (UDM), a practice that coordinates different teams and assimilates data. This practice also goes by other names in different industries, such as enterprise data management, enterprise content management and enterprise information management.

The common challenge in the healthcare ecosystem is to work towards the betterment in the quality of patient care despite the increasing cost. At the same time, there are mounting pressures for healthcare organizations to implement the latest regulations and healthcare reforms. Considering the volume of information that is handled in a healthcare setup, like patient data, hospital data, provider data, payer data and much more, implementing a better data management practice is a need of the hour for better efficiency and decision support.

As in several other industries, each link in the healthcare system, from providers to payers, is contending with extraordinary amounts of data. As the data expands to include big data, a healthcare setup needs a unified approach to managing it. This simplifies the integration of conventionally represented structured data with more challenging forms of unstructured or semi-structured data.

Unified data management encourages advanced analytics via the latest technologies, producing better outcomes and facilitating more accurate insights from the data. Applying UDM to the large amounts of data that come from remote device management systems also enables more tailored services and treatments for patients.

Benefits of UDM
  • Manages data more efficiently, especially in a healthcare setup.
  • Reduces costs throughout the system while ensuring strict adherence to regulatory standards and compliances. 
  • Ensures better patient care, higher patient satisfaction and improved outcomes. This also improves the administrative, clinical and financial operational efficiency in a healthcare eco-system. 
  • Manages data efficiently from multiple sources, such as hospital applications, payers, suppliers and intermediaries, thereby eliminating chances of inefficient data management.
Takeaways

A unified data model for healthcare helps solve complex issues involving multifaceted data, maximize the value of amorphous data in population management, convert insights into action via reporting and analytic data representations, and respond to changing healthcare requirements. This new approach to data management helps healthcare entities make meaningful use of their data assets.

Active Directory Federation Services: Why should you use it?

​By Chetan Kumar 


Active Directory Federation Services (ADFS) is an identity access solution from Microsoft that provides web-based clients (internal or external) with single-prompt access to one or more Internet-facing applications, even when the user accounts and the web applications are located in altogether different organizations. ADFS lowers the complexity of password management and guest account provisioning. It can also play a significant role for organizations that use Software as a Service (SaaS) and web applications. Refer to Figure 1 below: users in Organization A use their Windows credentials to log in, and ADFS authenticates access to all the approved third-party systems in Organization B.


Figure 1- ADFS and Single Sign-On in Organization A & B

Key Challenges that ADFS addresses

Prior to ADFS, many organizations used to deploy a separate Active Directory for authenticating and authorizing third parties that needed to use their services. In the majority of cases, you end up becoming an account administrator for external users, a burden that expands rapidly as they need password resets, new accounts and so on.

The other challenge is around de-provisioning of users. You have no control over the users who leave your partners, yet their accounts remain active in your AD. This may result in security incidents if such an employee still has access.

Should I use ADFS?
  • If you have a requirement to allow users from another business (contractor/partner) to access your internal resources (web applications, messaging services and so on). A practical example relevant to many organizations is outsourcing, where your partners/contractors access your resources to support your business functions. 
  • If you are planning to move some parts of your IT to a private or public cloud and want all the security factors to be seamless for the users. For example, in a hybrid environment, some internal users are moved to Office 365.
Key Benefits
  • Single Sign-On (SSO)
  1. Minimizes password phishing 
  2. Helps to minimize the need for repetitive logon exchanges 
  3. Reduces the repetition and submission of user credentials, which can otherwise lead to higher helpdesk support cost and end-user fatigue
  • Industry-standard identity protocols supported - Compatible with various security products/solutions that support the WS-* Web Services Architecture 
  • Eliminates the management of user accounts in a partner organization 
  • Extensible architecture - For instance, addition/modification of claims using custom business logic during claims processing (a small sketch of consuming ADFS-issued claims follows this list).
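
As a rough illustration of how an application consumes ADFS-issued claims, the sketch below validates a JWT access token and reads its claims with the PyJWT library. It assumes ADFS is configured to issue OAuth 2.0 / JWT tokens; the signing key, audience and issuer values are placeholders, not real configuration.

    # Hedged sketch: a relying-party application validating a token issued by
    # ADFS and reading its claims. Assumes PyJWT is installed and that ADFS is
    # set up for OAuth 2.0 / JWT; key, audience and issuer are placeholders.
    import jwt  # PyJWT

    ADFS_SIGNING_KEY = "-----BEGIN PUBLIC KEY-----\n...placeholder...\n-----END PUBLIC KEY-----"
    EXPECTED_AUDIENCE = "urn:myapp"                                  # relying-party identifier
    EXPECTED_ISSUER = "http://adfs.example.com/adfs/services/trust"

    def claims_from_token(token):
        """Verify the token signature and standard claims, then return all claims."""
        return jwt.decode(token, ADFS_SIGNING_KEY, algorithms=["RS256"],
                          audience=EXPECTED_AUDIENCE, issuer=EXPECTED_ISSUER)

    incoming_token = "<jwt-presented-by-the-user>"                   # placeholder
    claims = claims_from_token(incoming_token)
    print(claims.get("upn"), claims.get("role"))                     # example claims an app might use
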
Conclusion

ADFS is a very flexible technology from Microsoft that provides authentication and authorization to applications running in your environment for extranet users from different organizations. If you are planning to extend Active Directory outside of your environment or transition to Office 365 or the cloud, and want to reduce user account administration effort while providing claims-aware federation, ADFS is a solution you can rely upon.

Spend Analysis

By Reddy Balaji C

Introduction

In a globalized business environment, one of the major concerns for any business is retaining existing customers rather than just attracting new ones. For a customer, the cost of migrating to a new vendor is considerably low and the switch is easy. It has become a challenge for banks to enhance or tailor their offerings, extend banking product features and open up untouched business areas. Customer spend analysis is a critical success factor for any business.

Challenges in Spend Analysis

Though spend analysis is a useful activity for any company, there are a number of constraints in adopting it:
  • The required information is spread across various sources like the general ledger, accounts payable, bank transactions etc., so consolidating the data and bringing it to a standard format is challenging
  • Some important information, such as merchant information or product category, may be incomplete or incorrect
Spend Analysis - Solution

A spend analysis solution helps banks and customers visualize a consolidated view of the complete spending behavior spread across complex, multi-level accounts and several types of offerings. It encompasses activities from data collection to the derivation of intelligent business information, giving in-depth insight into spend patterns. The analytics outcome is represented in highly dynamic and interactive dashboards.

Core Areas of Spend Analysis
  • Data integration
  • Data cleansing and transformation
  • Dashboard and advanced analysis
Analysis Attributes

These are the different analytical attributes that can be used for exploration. Further, analysis is performed instantly (in-memory analytics) and the reports are generated in various formats.
  • Source for the transactions
  1. Credit Card
  2. Debit Card
  3. Online transfers
  4. ATM cash withdrawals
  5. Cash payments
  6. Internal transaction books
  • Date-time
  • Region
  • Merchant Category Code (MCC)
  • Category of item
  • Brand of the item
  • Amount spent
Key Performance Indicators (KPIs)
  • Data aggregations based on various dimensions/parameters (a small sketch follows this list)
  • Spend patterns based on the date/season
  • Top and least sold product categories for the given filter criteria
  • Future trend patterns based on predictive analytics
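
As a small illustration of the aggregation-style KPIs above, here is a pandas sketch computing spend by month and category from transaction records; the column names and sample data are assumptions made for illustration.

    # Illustrative spend-analysis aggregation with pandas. Column names and the
    # sample transactions are assumptions; real data would come from the sources
    # listed earlier (cards, transfers, ATM withdrawals, and so on).
    import pandas as pd

    transactions = pd.DataFrame([
        {"date": "2015-04-03", "source": "Credit Card", "category": "Groceries", "amount": 120.0},
        {"date": "2015-04-18", "source": "Debit Card",  "category": "Fuel",      "amount": 60.0},
        {"date": "2015-05-02", "source": "Credit Card", "category": "Groceries", "amount": 140.0},
        {"date": "2015-05-09", "source": "Online",      "category": "Travel",    "amount": 450.0},
    ])
    transactions["date"] = pd.to_datetime(transactions["date"])
    transactions["month"] = transactions["date"].dt.to_period("M")

    # Spend pattern by month and category (one of the KPIs listed above).
    print(transactions.groupby(["month", "category"])["amount"].sum())

    # Top and least sold categories for the full period.
    totals = transactions.groupby("category")["amount"].sum().sort_values(ascending=False)
    print("top:", totals.index[0], "least:", totals.index[-1])
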
Spend Analysis – Big Data - Predictive Analytics

Since huge volumes of data can arrive from different sources and the formats may vary depending on the source, it is a challenge to follow the traditional BI approach and extract value from the received data. To handle huge and heterogeneous data scenarios, big data technologies like Hadoop can be adopted and analytics applied to derive knowledge from the data. Since Hadoop can easily ingest different formats of data into its file system, it becomes simple to process this data in a distributed manner. Along with static analytics on historic data, predictive analytics goes one step further by calling out indicators from past events, which can be used to derive forward-looking information.

Conclusion

Analysis of the available data and strategic decision making are imperative in large organizations. Spend analysis helps corporates perform systematic data collection and extensive analysis of the data. Also, the predictive algorithms implemented will give forward-looking projections, which will contribute to effective decision making.

Security and Privacy Issues in the IoT Realm


By Sudheer P N

The advent of the Internet brought in a paradigm shift in the way humans learn, communicate, work etc. The Internet was a revolution, and humans benefited from it in every aspect of life. Along with its success, the Internet also brought major security risks: despite having security systems in place, attackers have been successful in breaching security and hacking systems.

As the Internet evolved over the decades, security mechanisms also evolved, from firewalls to unified threat management (UTM) systems, encryption and stronger authentication, to wage war against intruders. As the security solution of today may not work tomorrow, the only way to stay safer and smarter is to apply security patches regularly.

Like the Internet, the 'Internet of Things' (IoT) is gaining momentum with time. There is no doubt that IoT will be the next big thing in the future. Using IoT, everything can be controlled and monitored remotely with ease. Research forecasts suggest that IoT implementation would start around 2015 and reach its pinnacle by 2020. As IoT is in a nascent, evolving state, it also poses a lot of security threats. Let's discuss some of the security risks this technology can bring:

Reasons for Security Threats
  • Connected devices
  • Lack of availability of IoT standards
Connected Devices

Any sensor that connects to the IoT server and sends out information is a connected device. A smart meter is one such example: it sends water consumption information to the server at fixed intervals.

Connected devices are vulnerable to various attacks in the IoT ecosystem.

Why are connected devices prone to attacks?

Connected devices are built for low power, have little memory and slow processors, and run on embedded operating systems (OS).
  • An embedded OS is not designed to handle security issues. As access control is not present in the OS, the devices are vulnerable to attacks.
  • Due to low memory, firewall capability cannot be embedded within the device. Hence, the devices have no support to guard against intruders.
  • A slow processor makes it difficult to validate and encrypt/decrypt data in real time.
Security Issues
  • Distributed Denial of Service (DDoS) attack: An enterprise network will have many connected IoT devices. Imagine if all the compromised devices try to access a company website or request unavailable information from a server: this will choke the server and make it slow. Such attacks cost the company heavily.
  • Attackers gaining access to the devices can introduce viruses, BOTS, Trojan horses etc. 
  • Hackers can manipulate the data generated from devices. For example, wrong information about the trains on track would be devastating, and can lead to head on collision, costing thousands of lives.
  • Difficult to update software patches on compromised devices.
Privacy Issues
  • Compromised smart water meter would give a hint to the attacker about whether or not a house is occupied.
  • Hackers can easily manipulate sensitive personal information like health, location, bank account details etc.
IoT standards

Groups like ETSI, OneM2M and IEEE are working towards generating a standard for an IoT ecosystem. But, these standards are still evolving. Its capability can be felt only after deployments.

Solutions

The solutions stated below can resolve security concerns in the IoT space:
  • Standards should evolve fast and should be followed end-to-end.
  • Device manufacturers should come up with devices that have more memory and faster processors and that are compact in size. But this would take time to reach the market.
  • Device should authenticate itself before sending or receiving any data.
  • Role-based access control should be built into the operating system used in devices. This would limit the component usage.
  • Devices should have deep packet inspection capability to validate the data received for any kind of attacks.
  • The authenticity of the software on the device should be verified using a cryptographically generated digital signature (see the sketch after this list).
  • Devices should authenticate the regular software updates it receives from admins/operators.
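
The signature check mentioned above can be sketched as follows, using the Python 'cryptography' package; key handling is simplified for illustration, and on a real device only the vendor's public key would be present.

    # Illustrative sketch of verifying a signed firmware image before installing
    # it on an IoT device. Simplified key handling, for illustration only.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    # Vendor side (normally done once, offline): create a signing key pair.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()          # shipped with the device

    firmware = b"device firmware image v2.1"
    signature = private_key.sign(firmware, padding.PKCS1v15(), hashes.SHA256())

    # Device side: verify the update before applying it.
    def verify_update(image, sig):
        try:
            public_key.verify(sig, image, padding.PKCS1v15(), hashes.SHA256())
            return True
        except InvalidSignature:
            return False

    print(verify_update(firmware, signature))                 # True  - genuine update
    print(verify_update(firmware + b"tampered", signature))   # False - reject install
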
​The only way that IoT can be successful is by having a security system at every component level in its eco system. As many players are investing in IoT, if all the measures as mentioned above are followed, IoT shall be a greater success than we have already imagined today.

SDN: A Transformation Milestone in the Networking World

​By Sivabalan K


Before introducing SDN (Software Defined Networking), one should look into the evolution and limitations of the various computing resources, which include CPU, memory, network and storage, where everything used to be physical, tough to maintain and very hard to scale. But things started changing, thanks to virtualization. Virtualization products like VMware, Microsoft Hyper-V, KVM etc. could virtualize CPU, memory and storage (virtual hard disks) to a large extent, but networking resources could still not be effectively virtualized. Due to this, whenever a new VM is provisioned, the required networking resources have to be created beforehand, so that the VM can use those network resources to communicate with others in the network.

With the advent of cloud computing, the provisioning of the above-mentioned virtualized resources like VMs was highly automated. Yet, the networking resources lagged behind in this process, to the extent that they had to be deployed and configured manually, which required more hardware resources and increased energy consumption and manpower requirements as well.

Then entered a concept called Software Defined Networking. SDN is a new, emerging technology that decouples the decision-making layer, the "control plane", from the layer that actually forwards network traffic to its destination, called the "data plane" in SDN terms. The separation between the control plane and the data plane opens up possibilities for network administrators to control and configure the entire network just by accessing the control plane, instead of accessing each and every node in the network. This makes management of the network seamless and simple. It also enables SDN to integrate seamlessly with various cloud-based platforms, although the SDN implementation for each cloud platform may differ.
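
To picture the control plane / data plane split, the sketch below shows an operator script that talks only to the controller's northbound REST API to install a forwarding rule; the endpoint and rule format are invented placeholders, since each controller (OpenDaylight, ONOS, Floodlight and others) exposes its own API.

    # Hypothetical northbound-API call: the administrator talks only to the SDN
    # controller (control plane); the controller programs the switches (data plane).
    # Endpoint and rule schema are placeholders, not a specific controller's API.
    import json, urllib.request

    CONTROLLER_API = "http://sdn-controller.example.com:8181/api/v1/flows"   # placeholder

    def install_flow(switch_id, match, action):
        rule = {"switch": switch_id, "match": match, "action": action, "priority": 100}
        req = urllib.request.Request(CONTROLLER_API,
                                     data=json.dumps(rule).encode("utf-8"),
                                     headers={"Content-Type": "application/json"},
                                     method="POST")
        with urllib.request.urlopen(req) as resp:
            return resp.status

    # Steer all traffic destined to 10.0.0.5 out of port 3 on switch "s1".
    install_flow("s1", {"ipv4_dst": "10.0.0.5"}, action="output:3")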

SDN addresses the following network limitations:
  • Vendor dependence 
  • Traditional networks are expensive to maintain 
  • Each network router has its own proprietary software 
  • There is very little room for innovation, as the software for the networking equipment is developed only by its vendor 
  • Complex networks must be managed and reconfigured whenever a new machine is added 
  • IT departments find such networks difficult to manage, which exposes the enterprise network to regulatory non-compliance, security breaches and other negative consequences.
Some of the pressing needs for SDN:

The network traffic pattern has changed from a traditional client-server model to one where a single application draws information from multiple databases before returning a result to the user. With the advent of BYOD (Bring Your Own Device), it is a nightmare for IT administrators to manage all these devices on a corporate network while protecting confidential corporate data. The rise of cloud services, both public and private, also increases the need for SDN. Finally, the emergence of big data calls for massively parallel processing across thousands of interconnected servers, which requires additional data-center capacity with dynamic scaling capability.

SDN Architecture

Source: https://www.opennetworking.org/images/stories/downloads/sdn-resources/technical-reports/SDN-architecture-overview-1.0.pdf

A simple pictorial depiction of SDN:

Traditional computer networks bundle both planes into every device:
  • Data plane: forward, filter, buffer, mark, rate-limit, and measure packets
  • Control plane: track topology changes, compute routes, install forwarding rules

Software Defined Networking (SDN):
  • Logically-centralized control plane that programs the data plane of every switch

Source: https://www.cs.princeton.edu/courses/archive/spring12/cos461/docs/lec24-sdn.ppt (Slide #11)

Ongoing open source SDN controller projects:
  • ONOS
  • Project Floodlight
  • Beacon
  • NOX/POX (see the controller-app sketch after this list)
  • OpenFlow (strictly speaking the southbound protocol these controllers use to program switches, rather than a controller itself)
  • OpenDaylight (a controller baseline project upon which many other controllers are built)
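To give a flavour of what writing a controller app looks like, below is a rough sketch of a MAC-learning switch written as a POX-style component. This is an illustration only, not production code or the author's implementation; it assumes a working POX install, and the exact event and message attributes should be confirmed against the POX documentation.

```python
# Rough sketch of a MAC-learning switch as a POX component (illustrative).
# The controller (control plane) learns where hosts live and installs
# forwarding rules into the switch (data plane) via OpenFlow.
from pox.core import core
import pox.openflow.libopenflow_01 as of

log = core.getLogger()


class LearningSwitch(object):
    def __init__(self, connection):
        self.connection = connection
        self.mac_to_port = {}            # learned MAC address -> switch port
        connection.addListeners(self)

    def _handle_PacketIn(self, event):
        packet = event.parsed
        self.mac_to_port[packet.src] = event.port   # learn the sender's port

        if packet.dst in self.mac_to_port:
            # Known destination: install a flow rule so future packets are
            # forwarded by the switch without involving the controller.
            msg = of.ofp_flow_mod()
            msg.match = of.ofp_match.from_packet(packet, event.port)
            msg.actions.append(of.ofp_action_output(port=self.mac_to_port[packet.dst]))
            msg.data = event.ofp         # also release the packet that triggered this
            self.connection.send(msg)
        else:
            # Unknown destination: flood this one packet.
            msg = of.ofp_packet_out()
            msg.data = event.ofp
            msg.actions.append(of.ofp_action_output(port=of.OFPP_FLOOD))
            self.connection.send(msg)


def launch():
    # Attach a LearningSwitch instance to every switch that connects.
    core.openflow.addListenerByName(
        "ConnectionUp", lambda event: LearningSwitch(event.connection))
```
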
Conclusion

As the world moves towards cloud-based services, networking resources also need to be taken to the next level to match the cloud environment. SDN is the right step in that direction, allowing tailor-made SDN implementations for different cloud computing environments. Although it will not fulfil every requirement of cloud computing, it serves most of them.

Thursday, 28 May 2015

Ex-Decade with IoT

By Pradeep Pavaluru

It is the year 2025. The past decade has seen some significant disruption in the tech world, and one among the many disruptive technologies is the "Next Big Thing", the IoT. It has influenced and multiplied growth for the enterprises that placed huge bets on it over the last 10 years.

The IoT started out with the hope of changing M2M (Machine to Machine) communication, and it surprised everyone by becoming the most disruptive among things. In many ways, it changed how domains and verticals communicate, work and collaborate, with insights drawn from analytics, monitoring, mammoth data volumes and data lakes.

The world has witnessed this change, and I was part of this complete tech shift (like the tectonic plates) over the last few years. Things in the IoT space have grown in both directions: devices have become smaller and smarter, yet more capable and powerful. The following are a few fields worth a mention, as far as I can recall:

Healthcare: The last decade has seen a significant impact on human lives and their sustenance. Countless IoT devices and applications now help track minute-to-minute changes in the human body. Alongside this, parallel monitoring by the enrolled hospitals has commenced, extending life expectancy by about 60 days. In case of an emergency, devices and applications can give instant diagnostic references and inform emergency dispatch units.

Financial systems: The concept of e-wallets has changed since my era; there is no need to carry even your mobile anymore.

Thanks to HoloLens (from Microsoft) and a host of other vendors providing hologram-based solutions, we no longer use plastic cards. With the help of hologram technology, your body acts as the approver for your bills, payments, and so on.

Utilities and Home: A decade ago, we used to hire maids for cleaning, cooking and household activities. Now, all of it is handled by intelligent devices, controlled by a single tap on my smart screen. With smart metering for utilities in place, the devices generate the bill, push for payment (10 years back, we used to stand in queues or battle heavy website traffic) and renew the service automatically.

Counter Terrorism: With the Internet of Things, we are able to successfully counter terrorist activities in and around the Line of Control (LOC) with the help of:
  • SmartDogs, which help sniff out danger and identify scent trails; each can cover an area of 0.5 square km. In India, for example, around 2 million SmartDogs are deployed in and around the country's borders, regularly transmitting data for analysis.
  • iBeings / iBorgs, planned as another option and currently under beta testing, which use the power of device clouds (controlling an army of intelligent devices). These Borgs can also be brought in for the riskiest jobs, such as under-sea exploration, deep-jungle scanning, exploring live volcanoes, and so on.
Smart Cities: With many cities being converted into smart ones, India already has 12 fully functional smart cities, with commendable control over security, safety, reliability and economics. People who live in smart cities can control many things through the power of the internet, using their own devices.

Deep Space Explorators: Those who can afford an advanced SE (Space Explorator, which runs on any energy source and carries loads of intelligent devices on board) can explore deep space, controlling their missions: where, when and how. SEs can conduct a complete degree2degree mission on any alien planet.

Future: India has become a one-stop place for the latest trends in disruption, and the world is looking to us for the Next Big Thing. A lot of startups are working towards that "next big idea", and we never know what it is going to be. Let us hope for a safer and more tech-savvy future. The day is not far off when family and friends can go for an outing or tour anywhere in our galaxy.

Caution: I can see you are very excited after reading this blog, but before you search for any of these keywords or topics, please read my first sentence again.

Business Intelligence in Banking

By Rama Naik

In today's highly competitive, technology-driven business world, where businesses and technologies change at a rapid pace, there is a need for a strong customer retention program, a competitive marketing strategy and enhanced products backed by tactical promotion plans. To build such a strategy and enable quick decision making, one has to continuously analyze, measure, monitor and manage the data. Business intelligence reports play a major role here, helping organizations stay ahead of competitors and make decisions quickly.

Why BI in Banking?

As banks expand geographically and grow in size, they open multiple branches across international markets, all connected through networks. The data banks store sits in different silos, and the volume of data generated has become very large. Manual operation in this scenario is extremely time consuming and has many drawbacks, and generic reports may not give complete insight into the business. The lack of a coherent, integrated analytical framework drove the industry to analyze its data and build robust, intelligent systems for rapid decision making.

Therefore, the need is for the right business intelligence tool to carry out effective analysis of business problems and build successful strategies that speed up decision making and help improve and expand the business. To achieve this, one needs to analyze historical data, understand customer needs and plan for future requirements.

What can be done through BI?

Banks can use Business Intelligence tools to analyze historical data for strategizing and planning future growth. BI also helps in understanding customers better through their transaction patterns, interests and satisfaction levels. Historical data analysis assists in improved budgeting, marketing, sales promotion, product insights, new product design, customer retention program design, customer relationship management (CRM), risk management and regulatory compliance.

Customers, their details and their transaction-related data are key information assets for every bank. Banks have shifted their focus to Customer Relationship Management (CRM) to assess their KPIs.

One way to build a good relationship with the customer is to provide accurate and precise data on time. This in turn builds credibility and trust, and also drives business growth.

Customer analytics provides a panoramic view of customer data, with insights that help to:
  • Understand the landscape of the market in terms of customer groups, demography, transaction patterns, choice of products and customer opinions (see the sketch after this list)
  • Identify customer relationships and promote additional products to existing customers
  • Spot trends and design new products and offerings
  • Predict customer behavior and plan customer retention and loyalty programs
  • Use customer sentiment analysis to understand feedback and opinions on products and offerings
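As a hedged illustration of how transaction-pattern analysis might look in practice, here is a minimal Python/pandas sketch that computes recency, frequency and monetary (RFM) scores per customer from a toy transaction table. The column names, the tiny in-memory dataset and the segmentation rule are purely illustrative assumptions, not a reference to any real banking schema or to this article's toolset.

```python
# Minimal RFM (recency, frequency, monetary) sketch on toy transaction data.
# Column names and data are illustrative; a real bank would read these from
# its transaction warehouse instead.
import pandas as pd

transactions = pd.DataFrame({
    "customer_id": ["C1", "C1", "C2", "C2", "C2", "C3"],
    "txn_date": pd.to_datetime(
        ["2015-04-02", "2015-05-20", "2015-01-15",
         "2015-03-10", "2015-05-25", "2015-02-01"]),
    "amount": [1200.0, 450.0, 300.0, 2200.0, 150.0, 80.0],
})

as_of = pd.Timestamp("2015-06-01")   # analysis date

rfm = transactions.groupby("customer_id").agg(
    recency_days=("txn_date", lambda d: (as_of - d.max()).days),
    frequency=("txn_date", "count"),
    monetary=("amount", "sum"),
)

# Simple illustrative segmentation: recent, repeat customers are "loyal";
# everyone else becomes a candidate for a retention campaign.
rfm["segment"] = (
    (rfm["recency_days"] <= 30) & (rfm["frequency"] >= 2)
).map({True: "loyal", False: "retention-target"})

print(rfm)
```
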
Major benefits of BI
  • Better understanding of customers' needs, transaction patterns and satisfaction levels
  • Improved credibility and trust with customers
  • Reduced operational cost
  • Regulatory compliance
  • Better-aligned and improved sales and marketing programs
  • Increased sales through tactical promotion plans
  • Quantitative data for quick and optimal decision making
  • Input for strategies that shape future plans
Conclusion

The need of the market is a global data model framework that supports data from all markets and channels, including digital, mobile and social media.

Now is the time to ensure that maximum benefit is derived from the data by analyzing it with the right BI tool and appropriate reports, enabling new strategies and quick, precise decision making.