Wednesday, July 17, 2019

Simplifying Data Management with ONTAP

In the results of last year's 451 Research "Voice of the Enterprise" annual survey, the current enterprise IT infrastructure skills shortage was readily apparent. In that survey, 82% of respondents rated recruiting for infrastructure-based roles as either "somewhat difficult" or "very difficult". The reasons are varied, but the main ones are a straightforward lack of skills (69%), too few candidates in their region (48%) and/or high salary demands (43%).

The consequences for many enterprises are significant. It's well understood that infrastructure-based roles are foundational for driving innovation and business outcomes. I'm not fond of the term "digital transformation", but when we discuss the adoption of innovative technologies to redefine how companies deliver and extract value, it frequently begins with using technology for the purposeful transformation of activities, processes, competencies and models. This is true whether we're talking about on-premises, off-premises (public cloud, co-location, etc.) or hybrid deployments, although the specific skills may differ.

It isn't just transformation priorities that are at risk. The biggest IT priorities center on data management and are core to the organization. These include managing rapid data growth, data security, data analytics, cost reduction and cloud integration.

The IT skills shortage is real, and pressure from technology advances continues to build at a rapid pace. This has forced IT organizations to economize on their resources, both human and infrastructure. A common refrain from IT leaders is to "do more with less", which is pushing their teams to integrate these foundational technologies to become simpler and more agile.

It is also changing the skill sets of their teams. We have all seen the rise of the IT generalist, who possesses broad technical knowledge and soft skills that make them adaptable across many work environments and technologies. This also offsets the skills shortage and gives organizations more flexibility in how they deploy, manage and maintain IT infrastructure and data management.



For IT vendors, the solutions being developed and integrated have to complement the IT skillset and priorities in a way that is simple and easy to manage while delivering immediate value. These data management solutions should be the foundation of a holistic data vision, no matter the location of the data or the applications that depend on it.

For NetApp, the foundation for that data vision is ONTAP, the industry's leading data management software according to IDC. Whether used by an IT generalist or a storage expert, ONTAP provides a simple interface to securely manage data resources no matter where they reside - with immediate business value.

New versions of ONTAP are released every six months to ensure that the latest innovations and capabilities are delivered on a predictable schedule, so your environment is always enhanced and protected with the features of the latest release. The most recent version, ONTAP 9.6, offers a simplified management dashboard, over-the-wire encryption, and support for automated data tiering to Google Cloud Platform and Alibaba Cloud, in addition to Microsoft Azure, AWS, and IBM Cloud Storage.

In summary, ONTAP offers a smart, powerful and trusted data management platform that simplifies your IT operations:

Smart: Simplify Operations and Reduce Costs


  • Minimize capex and opex with leading storage efficiency.
  • Provision storage in minutes for Oracle, SAP, Microsoft SQL, VMware, and other business apps.
  • Tier data to the cloud. Automatically.


Powerful: Respond to Changing Business Needs


  • Accelerate critical workloads with industry-leading performance.
  • Scale capacity and performance without disruption.
  • Deploy enterprise applications on NetApp storage systems, commodity servers, or in the cloud.


Trusted: Protect and Secure Your Data Across the Hybrid Cloud


  • Guard against data loss and accelerate recovery with integrated data protection.
  • Eliminate business disruptions due to failures, maintenance, and disasters.
  • Safeguard your sensitive company and customer information with built-in data security.

Monday, July 15, 2019

Disaggregated HCI Becomes a Thing

One of the challenges of building an innovative product like NetApp® HCI is that it differs from existing market definitions. We've had plenty of discussions over the last couple of years about "what is HCI?" and "where do converged and hyperconverged infrastructures begin and end?" These conversations were interesting, but they didn't help customers solve their challenges.

To that end, IDC has announced a new "disaggregated" subcategory of the HCI market in the most recent Worldwide Quarterly Converged Systems Tracker. IDC is expanding the definition of HCI to include a disaggregated category with products that allow customers to scale in a non-linear fashion, unlike traditional systems that require nodes running a hypervisor. We're excited that this new perspective on the market will help customers evaluate HCI systems to address their business challenges.

A little bit of history on our approach to simplifying infrastructure for our customers


We approached HCI by taking a long look at how the industry was meeting the needs of customers trying to simplify their infrastructure. The HCI pioneers blazed trails along these lines by tightly coupling compute and storage with the hypervisor. These solutions deliver great value while serving many customer needs, but they sacrifice independent scale and enterprise-class storage functionality. We focused on these areas as an underserved gap in the market that aligned with NetApp's expertise.



NetApp® HCI is a hybrid cloud infrastructure that's architected with independent compute, storage, and networking. This disaggregated architecture enables our customers to scale according to their end-user business demands instead of architectural limitations. Need more compute horsepower to support a growing Splunk implementation? Or perhaps vGPU functionality to support data analytics? No problem - just scale in new compute nodes with new capabilities without adding costly storage capacity. Storage can be added just as easily without growing your hypervisor footprint.

With NetApp Element® software, NetApp HCI also delivers enterprise storage functionality to power your private cloud - making it simple to manage data at scale, enabling scale-out growth, predictable performance, and end-to-end automation. Always-on global deduplication and compression, combined with application-specific quality of service, means that customers can efficiently run mission-critical applications alongside consolidated virtual machines. Our disaggregated architecture prevents NetApp HCI from becoming a silo in our customers' data centers.

Where do we go from here?


To the cloud, of course.

We began the conversation around hybrid cloud infrastructure last year, and we continue to deliver on that part of our vision with the recent announcement of NetApp Kubernetes Service on NetApp HCI. Effective DevOps teams require disaggregated architectures and the flexibility to grow in unpredictable directions.

IDC's updated HCI taxonomy, in addition to adding disaggregated HCI, also leaves room for other future subcategories for containers and microservices. As IDC states, "this is a niche market with big potential." NetApp agrees. We see that in conversations with our customers, and we're taking our first steps in that direction with Cloud Volumes on NetApp HCI. We're excited about this new perspective on the HCI market, and we've accepted the challenge to continue to innovate and deliver simplified, multicloud experiences for our customers.

As a product manager, I love customer feedback. I'd love to hear your thoughts on disaggregated hyperconverged infrastructure and how your teams are evolving to take advantage of microservices.

Saturday, July 13, 2019

NetApp Wins Prestigious Awards for AI

NetApp has recently won two esteemed awards for artificial intelligence: the AIconics award for Corporate Innovation in AI and the AI Breakthrough award for Best AI Solution for Big Data.

The AIconics Award: Celebrating the Best in AI


Since their genesis in 2016, the AIconics have given innovators around the world a platform to showcase themselves to their peers and to enterprise end users. These esteemed global awards create the ultimate showcase for the best and brightest people, products, and transformational innovations.



NetApp was shortlisted for two awards and won the Corporate Innovation in AI award at a ceremony at Kensington Palace, London, on June 12, 2019.

AI Breakthrough Award: Recognizing the Best in AI Technology


The AI Breakthrough Awards recognize the best companies, technologies, and products in the artificial intelligence industry. The AI Breakthrough Awards are presented by Tech Breakthrough, a leading market intelligence and recognition platform.

There were more than 1,500 nominations this year, and NetApp ONTAP AI was selected as the winner of the Best AI Solution for Big Data award in the 2019 AI Breakthrough Awards program.

These awards validate that NetApp offers smart, powerful, trusted solutions designed to build your data fabric, integrating across edge, core, and cloud to accelerate your journey to AI. And our world-class partner ecosystem offers complete end-to-end solutions to help you achieve your strategic business goals.

Thursday, July 11, 2019

Data Is the New Fuel for Innovation. Can Your Business Tap Its Power?

You cannot expect data to fuel innovation if your organization isn't prepared to act on it. To investigate why and how companies must reengineer themselves to deliver on the promise of data through digital transformation, NetApp teamed with the Wall Street Journal and analyst firm IDC to produce original research and content focused on digital transformation.

Recent findings from IDC's global study showcase the strategies and technologies that companies are using to accelerate their digital transformation. The eye-opening results highlight the data-related challenges and corresponding maturity stages of organizations that thrive compared with those that resist transformation.

According to IDC, within the first three years of adoption, data-driven companies experienced:

  • A two-fold increase in revenue growth and customer satisfaction
  • A three-fold improvement in profits and customer acquisition
  • A six-fold improvement in efficiency


IDC is releasing new research that examines how organizations in healthcare, finance, and manufacturing are handling digital transformation. Tune in to the on-demand webcast for the results and to hear the discussion.



For a discussion of real-world examples, watch as industry leaders from NetApp and Integrated Archive Solutions demystify the "digital transformation" buzzword. Examining the need for a cultural shift, WSJ talks with Renown Health about its journey to becoming a leader in data-driven diagnostics and pioneering the world's largest community-based health study.

Come back as the WSJ and NetApp continue to tell fascinating stories of enterprises transforming their companies - and the world - with data.

A digital transformation deep dive isn't complete without insight and metrics from subject-matter experts and C-suite leaders at organizations like Dow Jones and IDC. These experts share one common observation: Transformation is not optional. It has become a baseline requirement for business success.

Hear key insights from:

  • Bill Miller, NetApp CIO
  • Ramin Beheshti, Dow Jones Chief Product and Technology Officer
  • Crawford Del Prete, IDC President
  • Henri Richard, NetApp EVP, WW Field and Customer Operations
  • Matt Watts, NetApp Director, Technology and Strategy


Visit Build a Data-Driven Culture for guidance from more subject-matter experts and industry leaders on high-value metrics. Get the inspiration and background you need to help your organization thrive with data.

Tuesday, July 9, 2019

How to Build a Data Pipeline for Autonomous Driving

This time I want to dig into ways to leverage the data engineering and data science technologies I've been discussing to solve autonomous driving challenges. I'll explore methods for gathering data from test and survey vehicles and ways to build appropriate data pipelines that satisfy data needs throughout the process.

Autonomous Driving Challenges


Many companies are already offering increasingly sophisticated advanced driver assistance systems (ADAS) as stepping stones toward Level 4 autonomy and beyond. If you're not completely familiar with the many players already competing in the self-driving space, Bloomberg has a recent summary.

Autonomous vehicle (AV) development projects face significant data challenges. Each vehicle deployed for R&D generates a mountain of data:

  • How do you create a pipeline to move data efficiently from vehicles in the field to your training cluster to train deep neural networks (DNNs)?
  • How do you efficiently prepare image and other sensor data and label (annotate) data for DNN training?
  • How much storage and compute will you need to train your neural networks? Should your training cluster be on-premises or in the cloud?
  • How do you properly size infrastructure for your data pipelines and training clusters, including storage needs, network bandwidth, and compute capacity?
  • What other data flows must you consider?


Data Pipeline for Autonomous Vehicle Development


An autonomous vehicle development program has many components, each with unique data management needs. The volume and variety of data creates unique challenges in all areas. This post describes some of the specific data and computing challenges in several key areas:

  • Data collection from test vehicles with full sensor suites
  • Training DNNs using labeled data produced from test vehicles
  • Simulation to test the performance of DNNs and to create additional training data
  • Mapping to create detailed representations of physical environments


Data Collection from Test Vehicles


During the data collection process, data must be ingested from each test vehicle in the fleet. The amount of data you actually collect per vehicle will vary based on your sensor suite.

Rule of thumb: Plan for 1-5TB per hour per vehicle during initial training, and adjust your plan as you receive actual results.
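To make that rule of thumb concrete, here's a minimal back-of-the-envelope sizing sketch in Python. The fleet size, logging hours, per-hour rate, and depot link speed below are illustrative assumptions, not recommendations - plug in your own numbers:

```python
# Back-of-the-envelope sizing for AV fleet data collection.
# Assumed inputs: 20 vehicles logging 8 h/day at 2 TB/h (within the
# 1-5 TB/h guideline), offloaded at a depot over a 25 Gbit/s link.

def daily_fleet_ingest_tb(vehicles, hours_per_day, tb_per_hour):
    """Total data generated by the whole fleet per day, in TB."""
    return vehicles * hours_per_day * tb_per_hour

def depot_offload_hours(daily_tb_per_vehicle, link_gbps, efficiency=0.7):
    """Hours to offload one vehicle's daily haul over a depot link,
    assuming the link sustains only `efficiency` of its line rate."""
    # Gbit/s -> TB/h: divide by 8 (bits->bytes), x3600 s, /1000 (GB->TB)
    usable_tb_per_hour = link_gbps / 8 * 3600 * efficiency / 1000
    return daily_tb_per_vehicle / usable_tb_per_hour

fleet_daily = daily_fleet_ingest_tb(vehicles=20, hours_per_day=8, tb_per_hour=2)
print(f"Fleet ingest: {fleet_daily} TB/day")  # prints "Fleet ingest: 320 TB/day"
print(f"Offload per vehicle over 25 GbE: {depot_offload_hours(16, 25):.1f} h")
```

Even this rough arithmetic shows why per-garage storage and network capacity, not just the central cluster, have to be part of the sizing exercise.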

You may find that data collection from test vehicles falls into two phases:

  1. Initial training. If you're training DNNs from scratch, you will need to collect all driving data from your test cars.
  2. Transfer learning. Once your DNNs begin to be effective, you may only want or need to gather data from situations where the test cars don't succeed or where safety drivers take control.


During initial training in particular, it's unlikely that you can transmit data from each vehicle over cellular networks, due to both bandwidth limitations and cost. It's much more likely that you will keep data on each vehicle and download it periodically when the vehicle reaches a garage or depot.

This requires data storage infrastructure in each test vehicle and in each depot location. As the test fleet expands to multiple metropolitan areas, you may need to add hub locations to aggregate data for each city. Because there isn't any single one-size-fits-all solution, NetApp offers a variety of options to address data collection from test cars, including:

  • In-vehicle ruggedized data collector solutions
  • Storage options optimized for garage and hub locations to enable your AV operations to scale
  • NetApp cloud services for near-the-cloud and in-the-cloud storage to support both cloud ingest and burst-to-cloud needs
  • Data mule solutions for bulk data transport to overcome network limitations


NetApp solutions scale to meet your capacity and I/O performance needs as your program grows from petabytes to exabytes.

Aggregating Data


As data is collected from test vehicles, it's typical to aggregate it into a data lake, either in a data center or in the cloud (or both). A data lake typically takes the form of a Hadoop deployment with HDFS, an object store, or a file store.

An incorrectly implemented data lake can become a bottleneck as data accumulates, so it's important to give this careful consideration. Because of the quantity of data being collected, the current best practice for autonomous vehicle development programs is to keep the data lake and training cluster on-premises, possibly with some parts of the work in the cloud as well.
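One illustration of the kind of up-front design that keeps a data lake from becoming a bottleneck is a partitioned path scheme, so downstream training jobs can prune by date, vehicle, or sensor instead of scanning everything. The `date/vehicle/sensor` layout and helper below are assumptions for illustration, not a NetApp or Hadoop convention:

```python
# Illustrative sketch: build Hive-style partitioned keys/paths for ingested
# AV sensor logs, e.g. raw/date=.../vehicle=.../sensor=.../run-001.bag
from datetime import date
from pathlib import PurePosixPath

def lake_path(root, capture_date, vehicle_id, sensor, filename):
    """Return a partitioned path for one ingested log file."""
    return (PurePosixPath(root)
            / f"date={capture_date.isoformat()}"
            / f"vehicle={vehicle_id}"
            / f"sensor={sensor}"
            / filename)

p = lake_path("raw", date(2019, 7, 4), "car-017", "lidar", "run-001.bag")
print(p)  # prints "raw/date=2019-07-04/vehicle=car-017/sensor=lidar/run-001.bag"
```

The same key scheme works whether the lake is HDFS, an object store bucket, or a plain file share, which keeps the layout portable if parts of the pipeline later move to the cloud.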

Sunday, July 7, 2019

Microsoft Inspire/Ready 2019: Don’t Miss This Year’s Top 10 List

The announcement from Microsoft that Azure NetApp Files is in GA is at the top of my list this year. This exciting announcement means many more opportunities for Microsoft Sellers and Microsoft Partners to help customers by moving high-performance NFS and SMB workloads to Azure in minutes. Visit our NetApp booth #824 and get your year off to a great start. Discover the different ways that NetApp can help you drive your Microsoft business forward.

NetApp is happy to be 60-strong at Microsoft Inspire/Ready this year. It's a big year for us - a big year for everybody. We'd love for you to stop by our booth #824 to meet our sales and executive teams, who have committed to making NetApp, Microsoft, and the Microsoft partner ecosystem even more successful this year.



Kicking off our Top 10 list for this year's event is a behind-the-scenes peek at why Azure NetApp Files is a game-changer for the public cloud. But Azure NetApp Files is only the start. Here is my Top 10 list of reasons you might want to meet us at Inspire/Ready. Let's go!

  1. Azure NetApp Files is now in GA (General Availability). We spent close to two years in joint development with Microsoft to build the best new Azure service currently in market. If you're an MS Seller or an MS Partner with customers who are moving or developing applications in the cloud, don't miss the opportunity to learn more about Azure NetApp Files. Join us at our booth or at our sessions. Read why Microsoft partners want Azure NetApp Files for their clients, or just get started now and register to be onboarded for Azure NetApp Files yourself.
  2. Take the Challenge - you didn't think we'd have a boring booth, did you? Absolutely not! We have the Spin It to Win It challenge with Azure NetApp Files. It's a timed challenge - you against the clock. Discover how quickly and easily you can set up a high-performing, mission-critical, tier-one enterprise environment using Azure NetApp Files (yes, you'll have a coach to show you the ropes). Winners and prizes will be announced daily.
  3. VIP Meetings to Drive Your Business Forward - Inspire/Ready is all about meeting with the right partners to plan out a successful year. We'll be sitting down with MS Sellers and MS Partners to map out joint plans for the year. Discover how Azure NetApp Files can augment your business, increase your revenues, and help your clients meet their Azure Financial Commit. Learn how to qualify for Azure NetApp Files marketing programs. Target customer acquisitions with a joint account planning session. Our meetings are designed to produce quantifiable business outcomes for both of us. Spaces are limited, so request your VIP meeting here.
  4. VIP Celebration before the Queen Concert - We have a VIP celebration event happening right before we all rock to Queen. "We Will Rock Your Azure (Quota)" is an invitation-only event for MS Sellers, NetApp Partners, or MS System Integrators moving workloads into Azure. Schedule a VIP meeting to get your exclusive invitation.
  5. Three Theater Sessions - Azure NetApp Files is featured in two sessions: "Expand SAP HANA on Azure Deployments with Azure NetApp Files" and "Migrate More Workloads to Azure Faster with Azure NetApp Files". We're also featured in the Tech Data session "Azure Automation Done Affordably with Cloud Solutions Factory". Register on your Inspire Session Scheduler.
  6. Digital Marketing & Your Personal Brand: Panel Session - Want to learn how your personal brand can influence and convert buyers for both you and your company? Join NetApp's VP of Cloud Marketing, Jennifer Meyer, LinkedIn executive Jacqueline Johnson, and Amy Protexter, VP Marketing at Insight, on a panel hosted by Gail Mercer-MacKay for the latest trends and actionable insights you can implement to drive value from your social brand. Register on your Inspire Session Scheduler.
  7. The WIT Network Happy Hour - NetApp is delighted to be the sponsor of the annual "The WIT Network" Happy Hour on Monday, July 15th at 5:00pm in the Community Zone Hub. Join us for champagne and networking.
  8. NetApp Mini-theater Sessions - Discover how NetApp can help you grow your Microsoft cloud business. We'll be running sessions every half hour in our booth to provide an overview of our Azure Marketplace offerings as well as a deep dive into Azure NetApp Files and why it's a cloud game-changer.
  9. Booth Giveaways - After each mini-theater session we'll be raffling off a mini JBL speaker. Learn more and win too!
  10. Kubernetes Now - NetApp Kubernetes Service is a Kubernetes-as-a-Service offering for creating Kubernetes clusters that are scalable, ready for production, and easy to manage from a single set of controls. In only three clicks, you can provision your NetApp Kubernetes Service and consume persistent storage resources on demand.

Friday, July 5, 2019

The Season for the Data Fabric

First, a little history.

I'm old enough to remember when Dave Hitz got up on stage at NetApp Insight and introduced this new term: Data Fabric. It wasn't a product, there were no deliverables, but it was a philosophy that NetApp would live and breathe in the development of its new and existing products.

Many of us were like, "Cool, but…huh?"

He explained that many new workloads would be cloud-based (although not all), and that while it's quite simple to deploy and destroy workload instances in the cloud, those workloads are useless unless they have relatively local access to the datasets needed to achieve business outcomes.

I believe the year was 2014. Kubernetes had either just emerged or was about to be released. The concept of the "service mesh" hadn't yet been recognized. But any doubts about the cloud being production-ready had been clearly vanquished, as AWS and Azure had already grown into behemoths, with each presenting new releases seemingly every day.

For a couple of years following this announcement, it seemed that "Data Fabric" would be an umbrella term that fell into the category of "marketecture" - just a cool term without any real meaning or implementation.

This phase ended in 2018, when NetApp reorganized into three divisions in order to realize the vision of Data Fabric. The creation of a cloud software unit, headed by Anthony Lye, put a sharp focus on using cloud and DevOps methodologies to enhance the tried-and-true technologies that NetApp had perfected over 25 years. NetApp was transformed from a storage company into a data services company.

So what is the Data Fabric now?


NetApp has created a foundational delivery architecture for workloads and their data. This is unique, as everybody else in the market concentrates on one or the other. Customers can provision, manage, and run production, development, or test application instances in the place that makes the most sense at the time. This has a significant positive effect on a data-driven application development and execution workflow, as organizations assess their "use-as-you-need" compute farms. This was never more apparent than last week, when NetApp announced a number of updates across its portfolio. I won't dive into all of them here, but you should read Matt Watts' recent blog for a full breakdown.



Considering that, according to IDC, the amount of data stored globally will grow from ~40ZB in 2019 to 175ZB in 2025, with 49% of that data stored in a public cloud, it's obvious that two things are true: 1) there's going to be a lot of data in the cloud, and 2) there's going to be a lot of data still resident in data centers. The majority of this new data will reside on NFS or S3-compatible object storage, which are best suited for multi-node compute farms to use. These datasets will contain millions or billions (or more?) of files (or objects), with capacities already exceeding the petabyte range. Moving datasets of this kind around by scanning filesystems just isn't feasible.
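As a quick sanity check on those IDC figures, growing from ~40ZB to 175ZB over the six years from 2019 to 2025 implies a compound annual growth rate of roughly 28%:

```python
# Implied compound annual growth rate (CAGR) behind the IDC figures
# quoted above: ~40 ZB in 2019 growing to 175 ZB in 2025 (6 years).
start_zb, end_zb, years = 40, 175, 2025 - 2019
cagr = (end_zb / start_zb) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # prints "Implied CAGR: 27.9%"
```

In other words, stored data more than quadruples over the period, which is the scale any foundational data architecture has to be designed for.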

At the core of the NetApp Data Fabric lies NetApp SnapMirror technology. SnapMirror enables you to efficiently move data around in a manner that makes the number of files irrelevant, without resorting to third-party replication software or appliances that introduce high rates of failure and greater skill requirements for administration.

NetApp redeveloped SnapMirror at the start of the Data Fabric movement to open it up to other platforms, such as S3 and the SolidFire platform, to extend the Data Fabric to as many use cases as possible.

What's thrilling now is that the cloud software unit has produced genuinely useful and production-ready technology that piggybacks on NetApp's Data Fabric achievements. One of these is the NetApp Kubernetes Service (NKS), which automates the best-practices deployment of K8s clusters with ready-to-consume apps wherever you want them: on-premises, or in the cloud of your choice. You can also tear one down and recreate it in another location, and NKS will automate the movement of the data from the old place to the new place.

I've personally been involved in projects where NetApp Cloud Volumes ONTAP has allowed my customers to achieve considerably faster analytics results using plenty of ephemeral cloud compute, leveraging data that resides mainly on-premises, and using the Data Fabric to get that data into the cloud. The customer stays at the top of the food chain, unlike organizations that get disrupted because they still cling to the traditional (read: slow and frustrating) 100% on-premises approach to application delivery.

If your organization is looking to achieve new or faster data-driven outcomes, it's vital that you choose a foundational architecture that not only gets and keeps that dynamic data in the places where you will be achieving those outcomes, but also brings your scaled applications to bear on that data to realize true acceleration. If you do your research, you'll find that NetApp has led in this space from the onset, and is so far ahead in these capabilities that you'll want to grab on to the NetApp Data Fabric, hold tight, and prepare for a wild ride.