Archives

Posts Tagged ‘big data’

Understanding SAP HANA

Posted on: October 14th, 2017 by Daniella Lundsberg

SAP HANA is an in-memory, column-oriented relational database management system that combines database, application processing, and integration services. Available for both on-premise and cloud deployment, SAP HANA makes big data, real-time analytics and aggregation, powerful data processing, data management, and data warehousing possible on a single platform. SAP HANA notably outperforms traditional databases thanks to its ability to keep vast volumes of information in memory, physically closer to the CPU and instantly accessible, so you can monitor and analyze incoming data the second it enters the system.

Since its launch in late 2010, SAP HANA has already proven itself at leading organizations from John Deere to eBay and Burberry. Companies are finding that it delivers on its promise of a simpler, faster, and more cost-efficient IT architecture capable of presenting business insight in real time. So much so that customers are putting their entire business suite on HANA and conducting complex analytical projects with remarkable ease.

SAP HANA employs modern hardware and database techniques to combine online transaction processing (OLTP) with online analytical processing (OLAP) in a single system. As a result, the platform supplies powerful real-time analytics while also simplifying application and database structures. HANA improves reporting and transactional processes, enabling rapid real-time processing of the massive amounts of data required to support today’s most innovative Big Data applications.

HANA further eliminates disk I/O operations along with the creation and storage of indexes and aggregates. By reducing the complexity of coding, administration, and infrastructure, it allows for a simpler design with fewer objects and operations. Organizations using SAP HANA are able to shrink their storage footprint, eliminate the need for multiple layers of software and hardware, and improve performance while reducing their power consumption and total cost of ownership. Put simply, SAP HANA simplifies your technology stack.

Companies considering HANA have two avenues for implementation. The first is a Full/Enterprise Use license, with cost based on database size in gigabytes. This is ideal for companies looking into Big Data, agile data marts, operational reporting, custom application design, side-car accelerators to the SAP Business Suite, and side-car operational reporting using customized or pre-delivered SAP HANA Live content. Shifting to the HANA database also provides access to new HANA-based applications and tools, including Simple Finance and SAP HANA Live.

SAP HANA Live, just one application included in the SAP Suite, delivers real-time operational reporting based on customizable virtual data models. This real-time reporting is achievable without negatively affecting performance. Dynamic tiering provides a smart and cost-efficient means for data archival and the management of massive amounts of data from multiple sources. SAP HANA’s powerful Big Data and predictive analytics capabilities also give customers the ability to build analytics solutions that provide unprecedented up-to-the-second operational visibility, forecasting, and analysis.

In addition to the Full/Enterprise Use license, SAP HANA may be installed under a Run-Time License, with cost based on SAP application value. The Run-Time License has no gigabyte restriction. This option is recommended for companies that don’t require Big Data but simply want to replace and upgrade their existing database to prepare for future growth and to take advantage of HANA’s many benefits and innovations. The Run-Time License is also suitable for organizations interested in side-car accelerators to the SAP Business Suite.

Should you migrate to SAP HANA?

Posted on: August 18th, 2017 by Daniella Lundsberg

Faster is better. This is especially true when it comes to accessing business data, which can now be made available in real-time thanks to SAP HANA. Beyond speed, SAP HANA wraps data into reports and analytics, delivers it to your desktop, and uses it to provide relevant offers to customers and users. This level of speed and transparency helps companies innovate, mitigate risk, and make all of those critical decisions that drive operations each day.

Real-time data availability is the primary benefit driving growth and demand for SAP HANA, a database and applications platform developed by SAP SE. SAP HANA (short for High-Performance Analytic Appliance) is now the fastest-growing technology in SAP history thanks to its revolutionary in-memory platform. It can be used for any kind of application and is capable of processing both structured and unstructured data instantaneously.

SAP HANA is an in-memory, column-oriented relational database management system that combines database, application processing, and integration services. Available for both on-premise and cloud deployment, SAP HANA makes big data, real-time analytics and aggregation, powerful data processing, data management, and data warehousing possible on a single platform. SAP HANA notably outperforms traditional databases thanks to its ability to keep vast volumes of information in memory, physically closer to the CPU and instantly accessible, so you can monitor and analyze incoming data the second it enters the system.

How fast is fast? While actual speed depends on the data model and other factors, SAP HANA has been shown to perform between 1,000 and 10,000 times faster than a traditional database platform. This is due to its in-memory design, which overcomes challenges inherent in on-disk systems, including the bottleneck between CPU and disk. To drive performance, SAP HANA leverages multi-core processors, low-cost main memory (RAM), and solid-state drives.

SAP HANA stores data in columnar structures, which enables heavy data compression and makes the entire data-retrieval process even faster. When a traditional database system tries to raise I/O performance to these levels, it can only do so through heavy memory consumption and additional CPU cycles, so users are forced to choose between, for example, a broad and deep analysis and high-speed performance. Traditional databases also spend far more memory and CPU cycles on caching and on maintaining those caches.
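
To make the idea concrete, here is a minimal Python sketch (illustrative only, not SAP HANA internals) of why column-oriented storage compresses so well: the values of a single column sit next to one another, so runs of repeated values collapse under a simple scheme like run-length encoding, and a query only has to read the columns it actually touches.

```python
# Minimal sketch (plain Python, not SAP HANA code): the same rows stored
# column by column, then compressed with run-length encoding (RLE).
from itertools import groupby

rows = [
    ("2017-10-01", "EMEA", 120),
    ("2017-10-01", "EMEA", 340),
    ("2017-10-01", "APJ", 95),
    ("2017-10-02", "EMEA", 410),
]

# Column-oriented layout: one list per column instead of one tuple per row.
columns = {
    "date": [r[0] for r in rows],
    "region": [r[1] for r in rows],
    "amount": [r[2] for r in rows],
}

def run_length_encode(values):
    """Collapse consecutive repeats into (value, count) pairs."""
    return [(value, sum(1 for _ in group)) for value, group in groupby(values)]

for name, values in columns.items():
    encoded = run_length_encode(values)
    print(f"{name}: {len(values)} values -> {len(encoded)} runs {encoded}")

# Low-cardinality columns such as 'date' and 'region' shrink dramatically,
# and an aggregate like SUM(amount) only has to scan the 'amount' column.
```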

But SAP HANA’s in-memory platform eliminates disk I/O, reduces memory consumption, and minimizes data redundancy along with the associated need for more storage space. Superior performance is backed by optimized compression, columnar RDBMS table storage, persistent data and log storage, partitioning, massive parallel processing, and ACID compliance. ACID (Atomicity, Consistency, Isolation, and Durability) compliance ensures transaction reliability, while partitioning supports the deployment of massive tables – dividing them into smaller segments that may be placed on multiple machines.
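
As a rough illustration of the partitioning idea just described (a toy Python sketch under stated assumptions, not how SAP HANA actually distributes data), rows of one large table can be routed to a fixed number of smaller segments by hashing a key; each segment stays small enough to load and scan independently, potentially on its own host.

```python
# Toy sketch of hash partitioning: rows of a large table are routed to a
# fixed number of smaller segments that could live on different machines.
# The table, key, and partition count here are hypothetical.
from collections import defaultdict

NUM_PARTITIONS = 4

def partition_for(key):
    """Choose a segment for a row by hashing its key."""
    return hash(key) % NUM_PARTITIONS

orders = [(order_id, f"customer-{order_id % 7}") for order_id in range(20)]

partitions = defaultdict(list)
for order_id, customer in orders:
    partitions[partition_for(order_id)].append((order_id, customer))

for segment, segment_rows in sorted(partitions.items()):
    print(f"segment {segment}: {len(segment_rows)} rows")

# Each segment can be scanned in parallel, which is the essence of the
# massive parallel processing described above.
```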

As a result, HANA makes computing faster, easier, and more efficient. It makes broad and deep analyses possible along with real-time availability, high-speed response, and powerful reporting capabilities. Code, application development, and setup are simplified, with end users able to enjoy a richer and more relevant experience. Finally, SAP HANA streamlines a company’s data footprint, making processing and operations more efficient than ever.

Contact American Digital to discuss whether or not SAP HANA is right for your organization.

Top 5 Reasons to Revisit Healthcare Analytics Today

Posted on: June 16th, 2016 by Daniella Lundsberg

With so much talk about Big Data today, it’s easy to forget that it continues to be a disruptive, revolutionizing force in healthcare. Technologists keep finding new ways to extract data from unlikely sources – videos, sensors, images, text documents – which data scientists then use to create more sophisticated models and discover new insights.

Smart healthcare groups today are leveraging these advances to solve key challenges throughout the organization:

  • Optimizing Patient Outcomes
    • Big Data is having a huge impact on clinical research as a whole, allowing investigators to mine patient data from around the globe as they look for cures to cancer, diseases, viruses and more. However, individual providers can also use analytics to study the factors involved in hospital-acquired conditions within their own population. This will help reduce the number of post-op infections, determine which device types (catheters, stents, implants) are more successful, find ways to limit the spread of infectious disease and more.
  • Implementing Workflow Improvements
    • Today’s healthcare providers are going beyond scheduling data to find ways to increase staff productivity, limit costs and reduce patient wait times. They’re analyzing every piece of data across the organization – information from the mobile devices physicians carry as they move from patient to patient, the sensors embedded in monitoring devices, dispensary applications, lab equipment, even surveillance footage – to see where improvements can be made.
  • Optimizing Cost and Reimbursement Models
    • In 2013 alone, hospitals were underpaid for medical services by $51 billion. Clearly, healthcare providers need to break down the barriers between siloed data stores in order to analyze all the factors that impact care – including revisits and testing protocols – to determine true cost of service. Being able to map that information to payer reimbursement rules will help organizations find a way to reduce growing financial deficits.
  • Pioneering Preventative Medicine
    • From wearables for tracking daily wellness habits to genetic testing for understanding predisposition to specific conditions, smart providers are using a wealth of individual and public data to create personalized medical plans that keep their patients healthier over the long run.
  • Boosting Patient and Community Engagement
    • Extending care delivery throughout the community means offering services and facilities tailored to an area’s unique requirements. Whether an organization is looking to grow through acquisition or expansion, planning teams need deep knowledge of population, health factors, care preferences and more. Mining patient information alongside public health and social media data will allow healthcare organizations to keep pace with evolving patient and community trends.

It takes a powerful yet flexible data storage and analytics architecture to continuously capitalize on the advances occurring in Big Data today. At American Digital, we start by optimizing structured data sets and increasing efficiency with next-gen storage technology such as HPE 3PAR. To help companies dig deeper and speed results, our analysts and data scientists rely on HPE’s purpose-built analytics platform equipped with natural language processing and machine learning to draw real-time insights from all available information sources – internal and external, structured and unstructured.

Becoming data-driven is one of the best strategies for ensuring relevance in today’s evolving healthcare space. Let us know, in the space below, how you’re using Big Data to drive change within your organization.

Using Big Data as a Strategic Asset

Posted on: September 23rd, 2015 by Daniella Lundsberg

Big data is everywhere. In today’s internet of things (IoT) era, big data is reinventing traditional IT and changing the way we use, share, and interpret information. For organizations looking to scale, big data is one of the keys to growth, innovation and, most importantly, insight.

A recent study by Forrester Research estimates that most companies are analyzing only around 12% of their current data. As organizations begin to make sense of big data, ignoring the insights locked in the other 88% can lead to higher IT infrastructure costs, higher organizational costs, and operational inefficiencies.

Using Big Data as a Strategic Asset

As companies collect, store and process vast amounts of data, organizations need to optimize their data by converting it into predictive analytics and actionable items. According to the IDC, big data technologies and architectures are designed to economically extract value from very large volumes of data by enabling high-velocity capture, discovery, and analysis. These technological shifts are reshaping the IT landscape and helping organizations solve day-to-day business challenges across the entire IT lifecycle.

Understanding and interpreting real-time data can lead to a better picture of today’s savvy customer. As organizations gain greater insight into their customers’ behavior, they will also maintain a stronger advantage over the competition. Here are some of the ways companies are using big data to better understand their customers:

  • Creating in-depth customer solutions based on customer profiles
  • Performing click-stream analysis of real-time offers to online customers
  • Monitoring social media for sentiment analysis and applying real-time responses
  • Better understanding the lifetime value of customers

Big Data, Big World: The Future of Big Data

Big data is not only about big business. This technological revolution will ignite fundamental changes across industries and organizations. What will the future of big data look like? Big data will revolutionize our lives by 2020, changing access to higher education and employment opportunities and shaping social media trends, surveillance, entertainment, and even the outcome of presidential elections.

How can American Digital help you make sense of big data?
American Digital is a Platinum HP partner certified to support and implement Big Data solutions, including HP’s Vertica. The versatile Vertica offering gives clients access to an analytics platform designed to exploit a wide variety of data while helping them accelerate business value through simplified reporting and analysis processes.

Cloudera, Hortonworks, and MapR: Comparing The Top Three Hadoop Distributions

Posted on: August 14th, 2015 by Daniella Lundsberg

As leading companies look for easier and more efficient ways to analyze and use the massive amounts of disparate data at their disposal, Apache Hadoop rises to the occasion. Hadoop is a powerful software framework that makes it possible to process large data sets across clusters of computers. This design makes it easy to scale quickly from a single server to thousands. With data sets distributed across commodity servers, companies can get up and running fairly economically and without the need for high-end hardware. What makes Hadoop even more attractive is the fact that it’s open source.
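
To give a feel for the programming model, here is a hedged sketch of the classic word-count job written for Hadoop Streaming, the interface that lets any program reading standard input and writing standard output act as a mapper or reducer; the file name and the paths mentioned afterwards are hypothetical.

```python
#!/usr/bin/env python
# wordcount.py -- rough Hadoop Streaming sketch (file name is hypothetical).
# The framework splits the input across the cluster, runs the mapper on each
# split, sorts the emitted key/value pairs, and feeds them to the reducer.
import sys


def mapper():
    # Emit "word<TAB>1" for every word seen on stdin.
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")


def reducer():
    # Input arrives sorted by key, so all counts for one word are adjacent.
    current_word, count = None, 0
    for line in sys.stdin:
        word, value = line.rstrip("\n").split("\t", 1)
        if word != current_word:
            if current_word is not None:
                print(f"{current_word}\t{count}")
            current_word, count = word, 0
        count += int(value)
    if current_word is not None:
        print(f"{current_word}\t{count}")


if __name__ == "__main__":
    mapper() if "map" in sys.argv[1:] else reducer()
```

A job like this would typically be submitted with the Hadoop Streaming jar, along the lines of hadoop jar hadoop-streaming.jar -input /data/text -output /data/counts -mapper "python wordcount.py map" -reducer "python wordcount.py reduce", and the same two functions run unchanged whether the cluster has one node or thousands, because the framework handles splitting, shuffling, and retries.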

But Apache’s standard open source software is far from an out-of-the-box solution, requiring additional development work to make it enterprise-ready. Hadoop excels at running complex analytics against massive volumes of data, but as a batch-and-load system it lags in its ability to run near real-time analytic queries. It also lags when it comes to streamlined data management and data governance. Luckily, its adaptable, modular architecture makes it relatively easy to add new enhancements and functionality.

As a natural evolution, a number of companies have stepped in to build on Hadoop’s framework to make it enterprise-ready. They’ve adjusted its code and bundled it together with sleek, user-friendly management tools and installers along with related technologies of their own, routine system updates, user training, and technical support. The most recognized of these Hadoop distributions are Cloudera, Hortonworks, and MapR.

Cloudera

Cloudera Inc. offers one of the oldest and most widely known Hadoop distributions, touting the strongest client base and market penetration. Cloudera was founded in 2008 by big data industry leaders from companies like Google, Facebook, and Oracle. It offers both an open source distribution, called Cloudera Distribution for Hadoop (CDH), and its proprietary Cloudera Management Suite. The company leverages its open source distribution by offering paid support and services. To differentiate itself, Cloudera also provides proprietary value-added components.

Setting Cloudera apart is its proprietary Management Suite, which includes sought-after features like wizard-based deployment, dashboard management, and a resource management module that simplifies capacity and expansion planning. Cloudera’s long-term objective, the company says, is to become an enterprise data hub, reducing the need for a separate data warehouse among companies that depend on it. Cloudera remains largely open source with just a few proprietary components, and its open source CDH distribution can run on a Windows server. This benefits users looking to minimize the risk of vendor lock-in and preserves the ability to switch to a different Hadoop distribution later with relative ease. Cloudera users include recognized brands like Groupon.

Hortonworks

Hortonworks is a newer player, founded in 2011 as an independent company spun off from Yahoo, which had maintained its Hadoop infrastructure in-house. Hortonworks focuses solely on providing an open source platform and is the only commercial vendor to do so, with MapR offering only a proprietary distribution and Cloudera offering both proprietary and open source components. Its primary offering, the Hortonworks Data Platform (HDP), is built upon Apache Hadoop and is enterprise-ready, complete with training and other support services.

Setting Hortonworks apart is the fact that it is a completely open enterprise data platform that’s free to use, which could lead to much faster improvements and updates. Its HDP 2.0 distribution may be downloaded directly from the company’s website and easily installed. Because Hortonworks is open source, it can also be integrated more quickly and easily. Hortonworks is currently in use at eBay, Bloomberg, Spotify, and Samsung Electronics.

MapR

MapR provides a complete Hadoop distribution, though one not based on Apache Hadoop itself, taking a notably different approach than Cloudera and Hortonworks. MapR has made Hadoop enterprise-grade by adding its own IP and enhancements that make it faster, more dependable, and more user-friendly. Having altered the file system, MapR is offered solely as a proprietary solution. Additional functionality may be added using Apache’s open source Drill, Spark, and Solr projects. The company bundles its solution with supplementary services including training and technical support.

Setting MapR apart are its ease of use, enterprise-grade features, and reliability. The company also claims to be the only distribution offering full data protection with no single point of failure. The proprietary MapRFS file system is more production-ready, and its implementation differs slightly from its counterparts because it is written in C rather than Java. MapR is a complete distribution that includes Pig, Sqoop, and Hive with no Java dependencies, independent of the Apache Software Foundation. It’s currently in use at leading companies including Cisco, Boeing, and Ancestry.com.

Choosing The Right Distribution

How much importance does your company place on technical support, expanded functionality, and system dependability? Are you looking to embrace the flexibility of open source to mitigate the risk of vendor lock-in, or does your company need a solution that can make a rapid impact on business and overall profitability?

Though similar in several ways, each vendor has its own strengths and weaknesses. When choosing the distribution that’s right for your organization, consider the added value offered by each option while balancing cost and risk. Companies will also want to weigh performance, scalability, reliability, data access, and manageability with both their short- and long-term goals.

American Digital: We make big data meaningful.

All of the records and files and facts and figures you’ve amassed over decades offer tremendous value in the form of new revenue and business opportunities. To unlock that value, though, businesses need an advanced and scalable technology solution.

American Digital helps organizations tap into the value of their big data assets, optimizing data and converting it into actionable real-time reports and analytics accessible through one administrative dashboard that’s viewable on any PC or mobile device. We work with all industries – from healthcare organizations that constantly update patient records to online retailers tracking ecommerce orders and social media reviews. Our solutions provide the means to easily collect vast amounts of data minute by minute and optimize it for real-time analysis. Get a complete picture, with essential insight gleaned from existing data at rest and data in motion.

American Digital manages the entire process – from planning through solutions design, implementation, and governance. Shift from a business intelligence organization to a big data-focused one, supported by a scalable solution able to unite disparate data formats and types. Improve decision-making, quickly identify business trends, and mitigate risk with a richer and more interactive analytics environment.

The Anatomy of a Big Data Solution for Enterprise

Posted on: July 13th, 2015 by Daniella Lundsberg

The entire point of Big Data is to unlock the value that will drive better decision-making, higher operational efficiencies, customer loyalty, behavioral insight, and a host of other business outcomes that can positively impact your organization’s bottom line. Getting there requires an infrastructure that can collect, store, access, analyze and manage all the various forms of data inside your servers, or in the cloud, and allow you to convert it into actionable intelligence. And, by the way, it needs to integrate into your existing environment.

Hardware, Software, Platforms

IT infrastructure has evolved over decades: from mainframes that handled yesteryear’s version of high-volume transactions, to online transactional processing (OLTP) databases that became widely accessible in the form of CRM, ERP, and e-commerce systems, to data warehouses that combined all of this transactional data with software for analytical insight – the rise of Business Intelligence (BI). Today, the evolution continues with HP ConvergedSystems bringing together compute, storage, and networking (including the HP ConvergedSystem for Big Data optimized for SAP HANA) and the HP Vertica Big Data Analytics Platform.

HP Vertica is a standards-based relational database that supports SQL and JDBC/ODBC and integrates tightly with all popular BI and visualization tools. It can handle SQL and Big Data analytic workloads at 30 percent of the cost of traditional data warehouse solutions. It runs queries 50 to 1,000 times faster, offers petabyte-scale storage with up to 30x more data per server, and provides the openness and simplicity to use any BI/ETL tools along with Hadoop. Organizations can use HP Vertica to manage and analyze massive volumes of data quickly and reliably without the limits or compromises of traditional enterprise data warehouses.
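
Because the platform exposes standard SQL over ODBC/JDBC, most existing tools and scripts can query it without special client code. As a hedged sketch (assuming an ODBC data source named VerticaDSN and a hypothetical sales table have already been set up), a Python query through the generic pyodbc driver would look something like this:

```python
# Rough sketch of querying a standards-based SQL database such as HP Vertica
# over ODBC from Python. The DSN name and the 'sales' table are hypothetical;
# any BI or ETL tool that speaks ODBC/JDBC connects in much the same way.
import pyodbc

connection = pyodbc.connect("DSN=VerticaDSN", autocommit=True)
cursor = connection.cursor()

# An ordinary analytic aggregate; a columnar engine reads only the columns
# the query actually references.
cursor.execute(
    """
    SELECT region, SUM(amount) AS revenue
    FROM sales
    GROUP BY region
    ORDER BY revenue DESC
    """
)

for region, revenue in cursor.fetchall():
    print(region, revenue)

connection.close()
```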

Healthcare industry innovators, business collaboration platform providers, multimedia entertainment companies, and mobile game developers are among the legions of HP Vertica believers. (See for yourself.)

How About Hadoop?

The HP Vertica Analytics Platform and Hadoop are highly complementary systems for Big Data analytics. While HP Vertica is ideal for interactive, blazing-fast analytics, Hadoop is well suited for batch-oriented data processing and low-cost data storage. When the two are used together, organizations benefit from the most powerful set of data analytics capabilities, extracting significantly higher levels of value from massive amounts of structured, unstructured, and semi-structured data.

Avoid Frankenstein’s Monster

Every organization’s technology environment and business requirements are unique, giving rise to the need for tailored solutions. Simply bolting parts onto your existing environment could cause the villagers to revolt. Before you embark on your Big Data quest, consult with a partner like American Digital to understand what embodies a successful enterprise implementation.

 

Join us in Chicago on July 23 at our Big Data Symposium. Meet tech execs from HP, SAP, Hortonworks, and Big Data guru & Fortune 50 consultant, Chris Surdak. Register here.

How Do You Know When It’s Time To Tackle Big Data?

Posted on: July 9th, 2015 by Daniella Lundsberg

Organizations ranging from healthcare, to education, to manufacturing, and Fortune 1000 companies across a wide swath of industries are candidates for Big Data implementations. Now is the time for information technology and line of business leaders to come together to understand when and how to tackle Big Data.

What exactly is Big Data?

Big Data is characterized by the four “Vs”:

Volume – the vast amount of data that is generated every second (This infographic [1] illustrates where all this data comes from.)

Variety – the different forms of data, both structured (like patient health records) and unstructured (like your fitness tracker’s stats)

Velocity – the speed at which data is generated and the speed at which it moves around (think skateboarding-cat viral videos)

Veracity – the trustworthiness of data, especially that of unstructured data, from social media, for example

You can, and should, also add a fifth “V”:

Value – The “V” that matters most is the ability to turn all that data into business value.

How can I extract value from my Big Data?

More than ever, companies are trying to understand how Big Data can help their organizations operate more efficiently and better serve their customers. It is vitally important to determine the requirements of each line of business and develop use cases that illustrate real-world scenarios. For instance, an industry use case shows how healthcare and life sciences companies can use the HP Big Data Platform to improve patient analytics through multiple aspects of operations. The value is seen in many areas:

  • Patient outcomes can be improved by using analytics to prevent complications, increase the effectiveness of treatments, and manage predictive care.
  • Organizations can generate all the metrics they need at a moment’s notice to stay compliant with healthcare reform mandates.
  • Deep insights from real-time analyses of clinical data can help inform medical researchers.

(See use cases for Financial Services, Public Sector and other industries here.)

Where do I start?

Before you undertake a Big Data initiative, consider what kind of business value you want to derive and consult with an expert, like American Digital, who can help your organization tap into the value of your Big Data assets. We provide everything you need to profit from Big Data from assessments to strategic planning and use case development. We work with top technology partners like HP, SAP, and Hortonworks to provide custom Big Data & Analytics solutions from design to implementation and governance. As a Platinum HP partner, we are certified to support and implement Big Data solutions including HP’s Vertica, a versatile offering that allows clients access to an analytics platform that is designed to exploit a wide variety of data while enabling them to accelerate business value from simplified reporting and analysis processes.

Find out if it’s time for your organization to tackle Big Data. We’re here to help.

If you’re in Chicago on July 23, join us for our free 2015 American Digital Big Data Symposium. Tech execs from HP, SAP and Hortonworks will be in attendance. Plus, you can meet Big Data guru, Fortune 50 consultant, and rocket scientist, Chris Surdak. Register here.

[1] Data Never Sleeps 2.0, DOMO

HP IDOL and Facial Recognition Technologies Unite To Enhance Public Safety

Posted on: June 24th, 2015 by Daniella Lundsberg

The security landscape is changing and, with it, highly secure organizations from the military to law enforcement are benefitting from behavioral recognition technology including facial recognition. Meanwhile, the explosion of big data puts a wealth of real-time information at our fingertips, giving us the ability to analyze data rapidly and expedite action. When we bring these innovations together – uniting multiple disparate data sources from various real-time surveillance videos, photos, and audio files – we can connect the dots in mere seconds. As such, we could potentially identify fraud, criminal suspects – even terrorists at a live public event – faster and easier than ever before.

Leading this space is HP Intelligent Data Operating Layer (IDOL), which helps organizations pool together a multitude of data sources to rapidly locate relevant information, analyze that information, and act on it immediately. And IDOL provides this capability out of the box – connecting with outside information without the need for third-party add-ons. With powerful extensibility, HP IDOL supports searches within massive video, photo, or audio libraries or feeds.

Even when data is stored securely behind various user privileges and requirements, HP IDOL has the brainpower to discover and index it across all of those secured sources and determine relevance in seconds. It then presents actionable analytics on a single dashboard with the speed and agility required to support critical decisions.

“We’ve helped government agencies employ HP IDOL to use their data intelligence for improved efficiencies and expedited decision-making, dramatically improving public safety,” explained an American Digital consultant.

Popular applications include:

  • Rapid mapping of an individual’s recorded interactions
  • Scene recreations using various photos and videos
  • Personnel screenings for enhanced base security
  • Real-time facial surveillance and monitoring for secure events
  • Facial feature matching against security clearance databases
  • Expedited security response and criminal case resolution

Is IDOL the right fit for your organization? Contact your Big Data consultant to learn more.

Measuring the Reputation Cost of Data Breach

Posted on: May 19th, 2015 by Daniella Lundsberg

The security landscape is constantly evolving, with businesses today fighting a cybercrime ecosystem that encompasses global players. Hackers now invest as much – if not more – toward exposing vulnerabilities as companies do in securing them. Assuming the right solutions are in place, most enterprises can stay under the radar and avoid a major breach. But there’s never a guarantee, and recent exposures at companies like Home Depot, Sony, and Target offer valuable insight that can help us all plan for and better comprehend the magnitude of loss potential.

The initial breach primarily affects the consumer, who faces bank-imposed limits and time-consuming card cancellations. For the business itself, along with that data loss, one of the greatest risks is long-term damage to the brand’s reputation. A trusted and established reputation can take decades to build – and mere seconds to destroy. Security breaches force companies to invest heavily in resources aimed at salvaging employee morale, stock valuation, consumer trust and loyalty. When customers are afraid to transact with a business, this also naturally puts a strain on traffic and revenue. For companies that survive a scandal, the fallout and ramifications can still take months to years to reconcile.

The extent of the damage and the time it takes to reinvigorate a brand’s reputation depend largely on the breadth of exposure and the manner in which the crisis is handled. Data breaches are one of the three occurrences with the greatest impact on brand reputation, according to a survey conducted by the Ponemon Institute and sponsored by Experian’s Data Breach Resolution unit, entitled “The Aftermath of a Mega Data Breach: Consumer Sentiment”. In that survey, data breach was ranked alongside environmental disasters and poor customer service.

When evaluating potential risk, planning for crisis resolution, and assessing the cost of an enterprise security solution, tangible assets alone aren’t enough. Decision makers must also estimate the monetary value of, and the earning potential in, their brand reputation. They need to ask themselves: “How much is our brand reputation worth?”

Ready for a security assessment? Contact the American Digital security team today.
