Must-know: An overview of "big data" (Part 4 of 8)
What is virtualization?
Virtualization is the technique of partitioning a single physical computer system, such as a server, into multiple virtual systems. “Big data” analysis involves processing very large data sets with advanced analytic algorithms designed to surface insights, patterns, and trends that aren’t yet understood. These advanced analytics require substantial processing power and memory. By adopting virtualization, software frameworks become more efficient. A computer’s hardware components, including the random access memory (or RAM), central processing unit (or CPU), hard drive, and network controller, can be virtualized into a series of virtual machines, each running its own operating system and applications.
The previous chart shows the market share of leading virtualization vendors. According to Data Center and Readers’ Choice 2013, VMware (VMW) leads the market with a 47% share, followed by Microsoft’s (MSFT) Hyper-V virtualization platform at 24% and Citrix’s (CTXS) XenServer at 8%. IBM (IBM) and HP (HPQ) are other leading vendors in this market. VMware also leads the bring your own device (or BYOD) trend. BYOD lets employees access corporate information through their own devices, keeping today’s workforce connected on a regular basis. VMware’s View Mobile Secure Desktop offers a safe and secure environment for mobile virtualization.
Virtualization’s importance for “big data”
Virtualization has three distinct characteristics to support the scalability and operating efficiency required for the “big data” climate.
- Partitioning – virtualization supports multiple applications and operating systems on a single physical system by partitioning the available resources. Each partition has its own memory and runs an independent operating system.
- Isolation – each virtual machine runs in isolation from its physical host system and from other virtual machines. If one virtual instance crashes, the other virtual machines and the host system aren’t affected. In addition, data isn’t shared between one virtual instance and another.
- Encapsulation – a virtual machine can be represented as a single file, which makes it easy to identify based on the services it provides. Because it encapsulates an entire system, a virtual machine is compatible with standard operating systems and applications.
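The three characteristics above can be illustrated with a short conceptual sketch. This is not real hypervisor code, and the class and method names (`Host`, `VirtualMachine`, `partition`, `crash`) are invented for illustration: it simply models a host carving dedicated RAM and CPU out of its pool (partitioning), each VM holding its own state (encapsulation), and one VM’s crash leaving the others untouched (isolation).

```python
from dataclasses import dataclass, field

@dataclass
class VirtualMachine:
    """Encapsulation: the VM's entire state lives in one object
    (on disk, a VM is similarly represented as a single file)."""
    name: str
    os: str
    ram_gb: int
    cpu_cores: int
    running: bool = True

    def crash(self):
        # Isolation: a crash changes only this VM's state.
        self.running = False

@dataclass
class Host:
    ram_gb: int
    cpu_cores: int
    vms: list = field(default_factory=list)

    def partition(self, name, os, ram_gb, cpu_cores):
        """Partitioning: carve dedicated RAM/CPU out of the host for a new VM."""
        used_ram = sum(vm.ram_gb for vm in self.vms)
        used_cpu = sum(vm.cpu_cores for vm in self.vms)
        if used_ram + ram_gb > self.ram_gb or used_cpu + cpu_cores > self.cpu_cores:
            raise ValueError("not enough free resources on the host")
        vm = VirtualMachine(name, os, ram_gb, cpu_cores)
        self.vms.append(vm)
        return vm

host = Host(ram_gb=64, cpu_cores=16)
web = host.partition("web", "Linux", ram_gb=16, cpu_cores=4)
db = host.partition("db", "Windows", ram_gb=32, cpu_cores=8)

web.crash()
print(db.running)  # the db VM is unaffected by the web VM's crash -> True
```

The key design point is that each `VirtualMachine` owns its slice of resources outright; the host only enforces that the slices never exceed the physical pool.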
Virtualization of servers, storage, applications, processors, memory, and networks ensures effective sharing of resources. For instance, with server virtualization, processing power utilization increases by ~65%. Virtualization isn’t considered a technical requirement of “big data.” However, software frameworks tend to run more efficiently in a virtualized environment.
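The utilization gain comes from consolidation, and the arithmetic is simple. The numbers below are hypothetical, chosen only to illustrate the mechanism (they are not the ~65% figure cited above): ten lightly loaded physical servers can be replaced by a couple of virtualized hosts running the same total workload.

```python
# Hypothetical consolidation arithmetic (illustrative numbers only).
servers = 10
avg_utilization = 0.10                    # each physical server is only 10% busy
total_work = servers * avg_utilization    # 1.0 "server's worth" of actual work

hosts_after = 2                           # consolidate onto two virtualized hosts
utilization_after = total_work / hosts_after

print(f"{utilization_after:.0%} utilization per host")  # 50% utilization per host
```

The same work now keeps each remaining machine far busier, which is exactly the resource-sharing benefit the paragraph describes.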
Browse this series on Market Realist:
- Part 1 - Overview: What is “big data”?
- Part 2 - Why “services” is the integral segment of “big data”?
- Part 3 - Social, mobile, analytics, and cloud is driving “big data” growth