Thursday, 11 June 2015

Essential Aspects to Secured and Protected Cloud Data Storage

By Thejaswini J

A cloud data storage system is a network of vast, distributed data centers that implement cloud computing technologies such as virtualization and provide a user-centric interface for storing all types of data.

Cloud computing has become a top priority for many technology-oriented organizations, and data storage has steadily made its way into the virtual space. Data security and privacy matter at every level of this space, be it IaaS (Infrastructure as a Service), PaaS (Platform as a Service), or SaaS (Software as a Service).

Let’s evaluate the scenarios of data security and privacy and make well-informed judgments about the risks involved for users and organizations. Not all data is of equal importance. For instance, the choice between a private cloud and a public cloud often turns on a clear classification of data into sensitive and non-sensitive.

We cannot deny the flexible access and storage facility that cloud computing offers! Yet, a substantial gap remains between how vendors and users perceive the transparency, privacy and security of cloud storage. Although the cloud industry maintains that clouds are more secure than the storage we currently use, privacy, security and availability remain the foremost concerns of any organization or user considering cloud adoption.

Security and Privacy

Data privacy, data management and data security are the main concerns of every user as well as every cloud service provider, as a growing number of companies seek to understand the security aspects before adopting a cloud service. The biggest threat is data loss through hackers who break into online servers and make off with the data; once it is gone from the cloud, there may be no way to retrieve it.

Cloud security depends on various interrelated factors.


Storing data on the cloud entails managing virtual data, physical data and the machines themselves, all of which is delegated to cloud service providers. Users, on the other hand, retain only limited control over their data and virtual machines.

Laws and Legal Obligations

Depending on the country, there are laws and legal provisions that every cloud service provider must adhere to while processing and storing your data, and their enforcement may differ from one country to another. Three main parties are involved in storing, processing and retrieving data: cloud storage service providers; subcontractors who supply infrastructure and resources to those providers; and you, the cloud user who wishes to store your data ‘safely’ on the cloud. Each of these parties is bound by legal obligations concerning personal rights such as data access and security. Everyone engaged in providing, accessing or using cloud services must understand that abiding by the law of the land is a must. As a cloud user, you should therefore consider the legalities involved, since you are primarily accountable for your data and its processing.

In Conclusion…

Even as cloud computing brings several advantages, data security and privacy protection remain major concerns and need to be addressed, as data is especially prone to threat while being shared or retrieved. Healthcare, billing and payments, online banking and e-commerce systems require the greatest security and protection, since they store credit and debit card details, transaction records and health data. The ability of today’s technology to control what information is revealed, and who can gain access to it, is a growing concern across this huge web world. So, cloud users, stay vigilant!

An Overview of Crowd Funding

By Mahesh D N

April 14th, 2013 was a monumental day for fans of Veronica Mars. More importantly, it is now remembered as the day when Crowdfunding truly became a global success story.

Crowdfunding is defined as the practice of pooling money from public sources on an online platform, instead of relying on the traditional method of loans from banks or financial institutions.

Business Model: A successful phenomenon hinges on a good business model with different stakeholders taking part, and Crowdfunding is no different. First and foremost, we have funders, members of the general public using their own means to donate money towards a project or cause. Fundraisers, on the other hand, are generally entrepreneurs or individuals inviting donations towards their dream project. Lastly, we have platform providers, the online websites that help fundraisers realize their dream projects. DonorsChoose, Kickstarter, Sciflies, Indiegogo, USeed, Prosper and RaisingSocial are some of the popular platform providers.

Stakeholder Analysis: A platform provider can be perceived as a website that gives an entrepreneur the freedom to express himself and attract attention to the project. The crowdfunding provider also takes it upon itself to use various marketing maneuvers to bring in donations for the project. Entrepreneurs research platforms that have had varying levels of success and enough word-of-mouth marketing to be deemed popular. While posting their project idea, fundraisers publish the target amount they are seeking and a stipulated time for reaching it on the platform. Funders are those who patronize the site regularly and fund a project they find interesting by transferring their money online.

Profitability Analysis for Funders: Funders who successfully fund a project are not just giving their money away; they expect returns and are enticed by different measures, the major one being gifts. Under the gifts model, the “Keep-it-All” variant gives the fundraiser the security of keeping all the money donated even if the target is not met, whereas the “All-or-Nothing” variant requires the fundraiser to return the entire amount donated if it falls short of the target. The second variant is more popular, as it assures investors that only a quality, fully funded project will be completed. Another reward comes in the form of equity, whereby funders share in the profits of the project while bearing no responsibility even if it fails. Finally, we have ‘Peer-to-Peer’ lending, where funders earn interest on the money they lend to the project, once credit terms are agreed upon.
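The two gift-model settlement rules can be sketched in a few lines of Python (a minimal illustration; the function name and model labels are hypothetical, not taken from any real platform):

```python
def settle_campaign(pledged_total, target, model):
    """Decide how much the fundraiser keeps under the two gift models.

    Returns the amount released to the fundraiser; the remainder
    (if any) is refunded to the funders.
    """
    if model == "keep-it-all":
        # Fundraiser keeps whatever was pledged, even below target.
        return pledged_total
    elif model == "all-or-nothing":
        # Funds are released only if the target is reached;
        # otherwise every pledge is refunded.
        return pledged_total if pledged_total >= target else 0
    raise ValueError(f"unknown model: {model}")
```

Under all-or-nothing, a campaign that raises 800 against a 1,000 target returns everything to the funders, which is exactly the quality assurance described above.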

Conclusion

Crowdfunding has caught on in the last decade and is very much a modern phenomenon. It gives entrepreneurs the freedom of no longer being held hostage to banks. At the same time, it provides a platform for the general public to bring to fruition a project they long for and believe in.

Thursday, 4 June 2015

Unified Data Management Drives Value in Healthcare

By Prabakar M

Data Management in Healthcare and Need for UDM

Many establishments today manage data in isolated silos, with independent teams using numerous tools for data quality, data integration, data governance and master data management. Increasingly, organizations are looking for a better approach, such as unified data management (UDM): a practice that coordinates these different teams and assimilates their data. The same idea goes by other names in different industries, such as enterprise data management, enterprise content management and enterprise information management.

The common challenge in the healthcare ecosystem is to improve the quality of patient care despite increasing costs. At the same time, healthcare organizations face mounting pressure to implement the latest regulations and healthcare reforms. Considering the volume of information handled in a healthcare setup (patient data, hospital data, provider data, payer data and much more), implementing a better data management practice is the need of the hour for efficiency and decision support.

As in several other industries, every link in the healthcare chain, from providers to payers, is contending with extraordinary amounts of data. As that data expands to include big data, a healthcare setup needs a unified approach to manage it, one that simplifies integrating conventionally structured data with the more challenging unstructured and semi-structured forms.

Unified data management enables advanced analytics with the latest technologies, producing better outcomes and more accurate, data-driven insights. Applying UDM to the large volumes of data coming from remote device management systems also enables more personalized services and treatments for patients.
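As a rough illustration of the assimilation step UDM performs, the sketch below merges patient records held in separate silos into one unified view keyed on a shared patient identifier. The source names and fields are made up for illustration; they are not a real healthcare schema:

```python
# Records as two silos might hold them: clinical data in a
# hospital system, coverage data in a payer system.
hospital_records = [
    {"patient_id": "P001", "name": "A. Rao", "last_visit": "2015-05-02"},
]
payer_records = [
    {"patient_id": "P001", "plan": "Gold", "claims_ytd": 3},
]

def unify(*sources):
    """Merge each silo's fields into a single record per patient."""
    unified = {}
    for source in sources:
        for record in source:
            unified.setdefault(record["patient_id"], {}).update(record)
    return unified

view = unify(hospital_records, payer_records)
# view["P001"] now combines clinical and payer fields in one place.
```

A real UDM practice adds the matching, cleansing and governance steps this sketch omits; the point is simply that downstream analytics sees one record per patient instead of several.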

Benefits of UDM
  • Manages data more efficiently, especially in a healthcare setup. 
  • Reduces costs throughout the system while ensuring strict adherence to regulatory standards and compliance. 
  • Ensures better patient care, higher patient satisfaction and improved outcomes, and improves administrative, clinical and financial operational efficiency across the healthcare ecosystem. 
  • Handles data from multiple sources, such as hospital applications, payers, suppliers and intermediaries, thereby reducing the risk of mismanaged data.
Takeaways

A unified data model for healthcare helps solve complex issues involving multifaceted data, extract the full value of unstructured data in population management, convert insights into action via reporting and analytical data representations, and respond to changing healthcare requirements. This new approach to data management helps healthcare entities put their data assets to meaningful use.

Active Directory Federation Services: Why should you use it?

By Chetan Kumar


Active Directory Federation Services (ADFS) is an identity access solution from Microsoft that gives web-based clients (internal or external) single-prompt access to one or more Internet-facing applications, even when the user accounts and the web applications sit in entirely different organizations. ADFS lowers the complexity of password management and guest account provisioning, and can play a significant role for organizations that use Software as a Service (SaaS) and web applications. In Figure 1 below, users in Organization A log in with their Windows credentials, and ADFS authenticates their access to all the approved third-party systems in Organization B.


Figure 1- ADFS and Single Sign-On in Organization A & B

Key Challenges that ADFS addresses

Prior to ADFS, many organizations deployed a separate Active Directory to authenticate and authorize third parties using their services. In the majority of cases you ended up as the account administrator for those external users, a burden that grows rapidly as they need passwords reset, new accounts added and so on.

The other challenge is de-provisioning of users. You have no control over users who leave your partner organizations, yet their accounts remain active in your AD, which can lead to security incidents if a former employee still has access.

Should I use ADFS?
  • If you need to allow users from another business (contractor/partner) to access your internal resources (web applications, messaging services and so on). A practical example common to many organizations is outsourcing, where partners or contractors access your resources to support your business functions. 
  • If you are planning to move parts of your IT to a private or public cloud and want security to remain seamless for users; for example, a hybrid environment in which some internal users are moved to Office 365.
Key Benefits
  • Single Sign-On (SSO) 
  1. Minimizes password phishing 
  2. Helps minimize the need for repetitive logon exchanges 
  3. Reduces repeated entry and submission of user credentials, which otherwise drives up helpdesk support costs and end-user fatigue
  • Industry-standard identity protocols supported - Compatible with various security products/solutions that support the WS-* Web Services Architecture 
  • Eliminates the management of user accounts in a partner organization 
  • Extensible architecture - Provides an extensible architecture. For instance: Addition/modification of claims using custom business logic during claims processing.
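The extensibility point above can be pictured with a toy example. Real ADFS claim rules are written in its own claim rule language; the Python below only mirrors the idea of transforming incoming claims with custom business logic before the application sees them (the claim names and the mapping rule are illustrative, not a real ADFS API):

```python
# Claims as received from a partner's identity provider
# (values invented for illustration).
incoming_claims = {
    "upn": "jdoe@partner.example",
    "group": "Contractors",
}

def transform_claims(claims):
    """Apply a custom rule during claims processing."""
    out = dict(claims)
    # Example rule: map the partner's group claim to an
    # application role this organization understands.
    if claims.get("group") == "Contractors":
        out["role"] = "ExternalUser"
    return out

app_claims = transform_claims(incoming_claims)
# app_claims carries the original claims plus the derived role.
```

The relying application then authorizes on the derived role claim, never needing a local account for the partner user.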
Conclusion

ADFS is a flexible Microsoft technology that provides authentication and authorization for applications running in your environment to extranet users from other organizations. If you plan to extend Active Directory beyond your environment or transition to Office 365 or the cloud, and want to reduce user account administration effort while providing claims-aware federation, ADFS is a solution you can rely upon.

Spend Analysis

By Reddy Balaji C

Introduction

In a globalized business environment, a major concern for any business is retaining existing customers rather than just attracting new ones, since the cost for a customer of migrating to a new vendor is low and the move is easy. Banks face the challenge of enhancing and tailoring their offerings, extending banking product features, and unveiling untapped business areas. Customer spend analysis is a critical success factor in all of this.

Challenges in Spend Analysis

Though spend analysis is a useful activity for any company, there are a number of constraints to adopting it:
  • The required information is spread across various sources like general ledger, accounts payable, bank transactions etc. So, data consolidation and bringing it to a standard format is challenging
  • Some important information like merchant information, product category may be incomplete or incorrect
Spend Analysis - Solution

A spend analysis solution helps banks and their customers visualize a consolidated view of complete spending behavior spread across complex, multi-level accounts and several types of offerings. It encompasses everything from data collection to the derivation of intelligent business information, giving in-depth insight into spend patterns. The analytics outcomes are presented in highly dynamic, interactive dashboards.

Core Areas of Spend Analysis
  • Data integration
  • Data cleansing and transformation
  • Dashboard and advanced analysis
Analysis Attributes

These are the different analytical attributes that can be used for exploration. Further, analysis is performed instantly (in-memory analytics) and the reports are generated in various formats.
  • Source for the transactions
  1. Credit Card
  2. Debit Card
  3. Online transfers
  4. ATM cash withdrawals
  5. Cash payments
  6. Internal transaction books
  • Date-time
  • Region
  • Merchant Category Code (MCC)
  • Category of item
  • Brand of the item
  • Amount spent
Key Performance Indicators (KPIs)
  • Data aggregations based on various dimensions/parameters
  • Spend patterns based on the date/season
  • Top and bottom product categories sold in the given filter criteria
  • Future trend pattern based on predictive analytics
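A couple of the KPIs above can be sketched directly against the analysis attributes listed earlier. The transaction records below are invented for illustration; a real solution would draw them from the integrated data sources:

```python
from collections import defaultdict

# Sample transactions using the analysis attributes listed above
# (source, MCC, category, amount); the data itself is made up.
transactions = [
    {"source": "Credit Card", "mcc": "5411", "category": "Groceries", "amount": 120.0},
    {"source": "Debit Card",  "mcc": "5812", "category": "Dining",    "amount": 45.5},
    {"source": "Credit Card", "mcc": "5411", "category": "Groceries", "amount": 60.0},
]

def spend_by_category(txns, source=None):
    """KPI: aggregate spend per category, optionally filtered by source."""
    totals = defaultdict(float)
    for t in txns:
        if source is None or t["source"] == source:
            totals[t["category"]] += t["amount"]
    return dict(totals)

def top_categories(txns, n=1, **filters):
    """KPI: top-n categories by total spend under the given filters."""
    totals = spend_by_category(txns, **filters)
    return sorted(totals, key=totals.get, reverse=True)[:n]
```

Because these aggregations run over in-memory structures, they can be recomputed instantly as the user changes filter criteria, which is the in-memory analytics behavior described above.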
Spend Analysis – Big Data - Predictive Analytics

Since huge volumes of data can arrive from different sources, in formats that vary by source, it is a challenge to follow the traditional BI approach and extract value from the received data. To handle such large, heterogeneous data, big data technologies like Hadoop can be adopted and analytics applied to derive knowledge from it. Since Hadoop ingests data of different formats easily into its file system, processing that data in a distributed manner becomes straightforward. Beyond static analytics on historic data, predictive analytics goes a step further by identifying distinctive indicators in past events that can be used to project future information.
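The MapReduce processing model that Hadoop applies to such data can be illustrated in plain Python. This is only a sketch of the map, shuffle/sort and reduce phases, not Hadoop itself, and the records are made up:

```python
from itertools import groupby
from operator import itemgetter

# Raw (category, amount) records, as a mapper might extract them
# from heterogeneous transaction files.
records = [
    ("Groceries", 120.0),
    ("Dining", 45.5),
    ("Groceries", 60.0),
]

def map_phase(recs):
    # Map: emit (key, value) pairs; here the key is the category.
    return [(category, amount) for category, amount in recs]

def reduce_phase(pairs):
    # Shuffle/sort: bring equal keys together, as the framework does
    # before handing each key's values to a reducer.
    pairs = sorted(pairs, key=itemgetter(0))
    # Reduce: aggregate the values for each key.
    return {k: sum(v for _, v in grp) for k, grp in groupby(pairs, key=itemgetter(0))}

totals = reduce_phase(map_phase(records))
```

In Hadoop the map and reduce phases run in parallel across many nodes over files in HDFS; the per-key aggregation logic is the same.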

Conclusion

Analysis of available data and strategic decision making are imperative in large organizations. Spend analysis helps corporates perform systematic data collection and extensive analysis of that data, and the predictive algorithms implemented provide forward-looking projections that contribute to effective decision making.

Security and Privacy Issues in the IoT Realm


By Sudheer P N

The advent of the Internet brought a paradigm shift in the way humans learn, communicate and work. The Internet was a revolution, and humans have benefited from it in every aspect of life. Along with its success, however, it also brought major security risks: despite security systems being in place, attackers have repeatedly succeeded in breaching them and hacking systems.

As the Internet evolved over the decades, security mechanisms evolved too, from firewalls to unified threat management (UTM) systems, encryption and stronger authentication, to wage war against intruders. Since today's security solution may not work tomorrow, the only way to stay safer and smarter is to apply security patches regularly.

Like the Internet, the 'Internet of Things' (IoT) is gaining momentum with time. There is no doubt that IoT will be the next big thing: with it, everything can be controlled and monitored remotely with ease. Industry research projects that IoT implementation, starting around 2015, will reach its pinnacle by 2020. As IoT is still in a nascent, evolving state, it also poses many security threats. Let's discuss some of the security risks this technology can bring:

Reasons for Security Threats
  • Connected devices
  • Lack of availability of IoT standards
Connected Devices

Any sensor that connects to an IoT server and sends out information is a connected device. A smart meter is one such example: it sends water consumption readings to the server at fixed intervals.

Connected devices are vulnerable to various attacks in the IoT ecosystem.

Why are connected devices prone to attacks?

Connected devices are built for low power, with little memory and slow processors, and run on embedded operating systems (OS).
  • An embedded OS is typically not designed to handle security issues. With no access control in the OS, the devices are vulnerable to attack.
  • Due to low memory, firewall capability cannot be embedded within the device, so the devices have no built-in defense against intruders.
  • A slow processor makes it difficult to validate and encrypt/decrypt data in real time.
Security Issues
  • Distributed Denial of Service (DDoS) attack: An enterprise network will have many connected IoT devices. Imagine all of them, once compromised, trying to access a company website or request unavailable information from a server; this will choke the server and slow it down. Such attacks cost the company heavily.
  • Attackers who gain access to the devices can introduce viruses, bots, Trojan horses and so on. 
  • Hackers can manipulate the data generated by devices. For example, wrong information about trains on a track would be devastating, potentially leading to head-on collisions that cost thousands of lives.
  • It is difficult to update software patches on compromised devices.
Privacy Issues
  • Compromised smart water meter would give a hint to the attacker about whether or not a house is occupied.
  • Hackers can easily manipulate sensitive personal information like health, location, bank account details etc.
IoT standards

Groups like ETSI, OneM2M and IEEE are working towards standards for the IoT ecosystem, but these standards are still evolving, and their effectiveness will be proven only in real deployments.

Solutions

The following measures can help resolve security concerns in the IoT space:
  • Standards should evolve fast and be followed end-to-end.
  • Device manufacturers should build devices with more memory and faster processors that remain compact in size, though such devices will take time to reach the market.
  • Devices should authenticate themselves before sending or receiving any data.
  • Role-based access control should be built into the operating system used on devices, limiting what each component can do.
  • Devices should have deep packet inspection capability to validate received data against any kind of attack.
  • The authenticity of the software on a device should be verified by a cryptographically generated digital signature.
  • Devices should authenticate the regular software updates they receive from admins/operators.
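The update-authenticity checks in the last two points can be sketched as follows. A real deployment would verify an asymmetric digital signature (for example RSA or ECDSA over the firmware hash); for brevity this sketch uses an HMAC shared between the device and the operator, and the key is purely illustrative:

```python
import hashlib
import hmac

# Illustrative shared secret provisioned into the device at
# manufacture; real systems would use asymmetric keys instead.
SHARED_KEY = b"device-provisioning-key"

def sign_update(firmware: bytes) -> bytes:
    """Operator side: compute an authenticity tag over the image."""
    return hmac.new(SHARED_KEY, firmware, hashlib.sha256).digest()

def verify_update(firmware: bytes, tag: bytes) -> bool:
    """Device side: recompute the tag and compare before installing."""
    expected = hmac.new(SHARED_KEY, firmware, hashlib.sha256).digest()
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(expected, tag)

fw = b"firmware-v2.1"
tag = sign_update(fw)
# verify_update(fw, tag) accepts the genuine image;
# any tampered image fails the check and is rejected.
```

A device that refuses unverified images closes off one of the main routes by which attackers plant viruses, bots and Trojan horses on connected devices.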
IoT can succeed only with a security system at every component level of its ecosystem. With so many players investing in IoT, if all the measures above are followed, it may become an even greater success than we have imagined today.

SDN: A Transformation Milestone in the Networking World

By Sivabalan K


Before introducing Software Defined Networking (SDN), one should look at the evolution and limitations of the basic computing resources (CPU, memory, network and storage) in a world where everything was physical, tough to maintain and very hard to scale. Then things started changing, thanks to virtualization! Products like VMware, Microsoft Hyper-V and KVM could virtualize CPU, memory and storage (virtual hard disks) to a large extent, but networking resources still could not be virtualized effectively. As a result, whenever a new VM was provisioned, the required networking resources had to be created beforehand so the VM could use them to communicate with others on the network.

With the advent of cloud computing, the provisioning of virtualized resources such as VMs became highly automated. Yet networking lagged behind, to the point that networking resources still had to be deployed and configured manually, which demanded more hardware and increased energy consumption and manpower requirements as well.

Then came a concept called Software Defined Networking. SDN is an emerging technology that decouples the decision-making layer, the "control plane", from the layer that actually forwards network traffic to its destination, called the "data plane" in SDN terms. Separating the control plane from the data plane lets network administrators control and configure the entire network just by accessing the control plane, instead of touching each and every node. This makes network management seamless and simple. It also enables SDN to integrate with various cloud-based platforms, although the SDN implementation for each cloud platform may differ.
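The control-plane/data-plane split described above can be pictured with a toy model: the controller holds the forwarding policy centrally, while each switch only matches packets against the rules the controller installed. Class and rule names here are illustrative, not a real SDN API:

```python
class Switch:
    """Data plane: forwards packets by looking up installed rules."""
    def __init__(self, name):
        self.name = name
        self.flow_table = {}  # destination -> action

    def forward(self, dst):
        # Unknown destinations are dropped by default.
        return self.flow_table.get(dst, "drop")

class Controller:
    """Control plane: makes forwarding decisions for all switches."""
    def __init__(self):
        self.switches = []

    def register(self, switch):
        self.switches.append(switch)

    def install_rule(self, dst, action):
        # One decision, pushed to every switch, instead of
        # configuring each network node by hand.
        for sw in self.switches:
            sw.flow_table[dst] = action

ctrl = Controller()
s1, s2 = Switch("s1"), Switch("s2")
ctrl.register(s1)
ctrl.register(s2)
ctrl.install_rule("10.0.0.5", "port-3")
# Both switches now forward traffic for 10.0.0.5 out of port-3.
```

In a real deployment the controller speaks a southbound protocol such as OpenFlow to the switches, but the division of labor is the same: policy in one place, fast rule lookups everywhere else.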

SDN addresses the following limitations of traditional networks:
  • Vendor dependence 
  • The high cost of maintaining traditional networks 
  • Proprietary software locked into each network router 
  • Very little room for innovation, as the software for networking equipment is developed only by its vendors 
  • The complexity of managing and reconfiguring the network whenever a new machine is added 
  • The burden on IT departments of managing it all, which exposes the enterprise network to regulatory non-compliance, security breaches and other negative consequences
Some of the pressing needs for SDN:

Network traffic patterns have shifted from the traditional client-server model to a state where a single application draws information from multiple databases before returning the end result to users. With the advent of BYOD (Bring Your Own Device), it is a nightmare for IT administrators to manage all these devices on a corporate network while protecting confidential corporate data. The rise of cloud services, both public and private, further increases the need for SDN. Finally, the emergence of big data demands massive parallel processing across thousands of interconnected servers, which requires additional data center capacity with dynamic scaling capability.

SDN Architecture

Source: https://www.opennetworking.org/images/stories/downloads/sdn-resources/technical-reports/SDN-architecture-overview-1.0.pdf

A simple pictorial depiction of SDN


Traditional Computer Networks: 
  • Data plane: forward, filter, buffer, mark, rate-limit, and measure packets 
  • Control plane: track topology changes, compute routes, install forwarding rules


Software Defined Networking (SDN)

  • Logically-centralized control


Source: https://www.cs.princeton.edu/courses/archive/spring12/cos461/docs/lec24-sdn.ppt - (Slide # 11)​

Ongoing open source SDN controller projects:
  • ONOS 
  • Project Floodlight 
  • Beacon 
  • NOX/POX 
  • OpenFlow (strictly a southbound protocol that controllers speak, rather than a controller itself) 
  • OpenDaylight (a controller baseline project upon which many other controllers are built) 
Conclusion

As the world moves towards cloud-based services, networking resources also need to be upgraded to the next level to match the new cloud-based environment. SDN is the right step in that direction, allowing tailor-made SDN implementations for different cloud computing environments. Although it will not fulfil every requirement of cloud computing, it serves most of them.