2014: Tableau Sees Big Data Ramp Ups Will Get Easier and Faster
Tableau Software execs say 2014 will be the year when big data projects get faster and easier – both for IT and business users. Many trends are in place to better enable non-technical users to leverage diverse company data to discover insights and even predict future trends. IDN speaks with Tableau’s Elissa Fink about the top trends.
by Vance McCarthy
"We see the end of the era of the domination of the data scientist. Data science will move from the specialist to the everyman."
– Elissa Fink, Chief Marketing Officer, Tableau Software
IDN speaks with Tableau’s Chief Marketing Officer Elissa Fink about several eye-catching changes coming to big data and analytics this year. Many of these trends will be powered by advances in skills and open technologies for data integration, data visualization and API-driven data services.
“The great thing is that with all these innovations companies will be able to use best of breed, so if they want to find new and different ways to leverage data they can and they will,” Fink said.
Top trends in big data and the delivery of insights to business users include:
Trend #1. The end of data scientists.
“We see the end of the era of the domination of the data scientist. Data science will move from the specialist to the everyman,” Fink told IDN. “There is always going to be a role for the data specialist or the data scientist, but the end is nearing for the idea that you have to be a data scientist to interact with, use and leverage your data.”
Specifically, familiarity with data analysis will become part of the skill set of ordinary business users, not just experts with “analyst” in their titles. Organizations that use data to make decisions are more successful, and those that don’t are beginning to fall behind.
Fink likened this new era of user-friendliness for data analysis to what happened with word processing 20 or more years ago. “Today, everybody has and knows how to use a keyboard and use software for documents. Think back, 20 years ago, you might send your documents to data entry or even the typing pool. That would never happen today. So, just like with documents, everyone should be able to work with their data,” Fink said.
Trend #2. Storytelling with data becomes a priority.
As data volumes explode, there is a stronger need for technologies and skilled people to bring together all the elements that help data paint a picture of what is going on – whether it’s a real-time snapshot, a trend or even a prediction, Fink said.
“People are realizing they need more than a deluge of pretty data splashed onto their dashboard or a mobile device dashboard,” she said. Rather, they need that data presented with context, and that can require integration, business rules and ways to align data from multiple sources.
In this way, rich context delivers “stories” and makes it easier for business users to derive insight and even ask new questions. “Stories become a way to communicate ideas and insights using data. They also help people gain meaning from an overwhelming mass of big and disparate data,” Fink added.
Trend #3. NoSQL is the new Hadoop.
Organizations will explore in earnest how to use all types of unstructured data. This will mean NoSQL technologies will become more popular as companies seek ways to assimilate this kind of data, Fink said.
“We’re seeing people are recognizing [they can] make sense of unstructured data and gain business value and deliver summary findings out of it. The NoSQL technologies will make it easier for companies to use that data,” Fink said.
While she admits that in 2014 the intelligent use of unstructured data will still be limited, there are other signs of NoSQL’s rising popularity. “We’re also seeing the major RDBMS providers are recognizing the power of NoSQL, and their customers are paying attention to it,” Fink added. As more and more large customers look for ways to integrate NoSQL into their traditional database offerings, the vendors are also putting ways to work more closely with NoSQL on their product roadmaps, she said.
Trend #4. Big data from the cloud gets real.
Cloud-based business intelligence will see more mainstream adoption this year, Fink said. Primary drivers will be cloud data warehouses like Amazon Redshift and Google BigQuery, which will transform the process of building out a data warehouse.
No longer will it be a labor-intensive, months-long process. Using the cloud, companies can have projects up and running in a matter of days, according to Fink. “This enables rapid prototyping and a level of flexibility that previously was not possible. Cloud offerings like Teradata Cloud and SAP HANA from traditional vendors validate the space.”
This cloud capability also means organizations will be more easily able to deliver fast and rich analytics with cloud-based business intelligence, she added. “The maturation of cloud services helps IT departments get comfortable with business intelligence in the cloud. New scenarios such as collaboration with customers and outside the firewall mobile access also accelerate adoption,” Fink said.
New Addition to Tableau’s Cloud-based Analytics Solutions
On this cloud-enabled big data front, Tableau continues to expand its ecosystem. Tableau recently announced a partnership with Treasure Data to simplify the creation and deployment of big data and business analytics projects, with rapid time to value.
Under the partnership, Tableau's data visualization tools will extend Treasure Data's capabilities for acquiring, storing, and analyzing data in a managed cloud service. The combination will allow customers to create dynamic visualizations, interactive analytics, and shared insights.
With the Tableau technology directly integrated into the Treasure Data service, customers eliminate the need to move big data out of the cloud, speeding and simplifying data management and making big data analytics accessible to more people within the organization.
As companies turn to the cloud for big data deployments, Treasure Data positions itself as the first service to address each phase of the data pipeline. “The joint Tableau and Treasure Data solution helps customers at each stage of the process, thereby reducing management time and costs of the analytic infrastructure,” said Dan Jewett, Vice President of Product Management at Tableau Software, in a statement.
“With the amount of data being produced today, organizations need to reduce process redundancy, more efficiently analyze new big data types and scale their businesses based on data-driven decisions,” said Hiro Yoshikawa, CEO of Treasure Data, in the statement. “Bringing Tableau's trusted and established visualization tools into the Treasure Data service allows customers to more efficiently analyze all types of data, uncovering insights to transform their organization.”
One corporate customer has already welcomed the Tableau/Treasure Data partnership. “Tableau's intuitive design helps our staff analyze data, create visualizations and react quickly to changes, while the Treasure Data Service provides a simple and reliable approach for engineering to collect, store and process massive data volumes with low latency,” said Simon Dong in the statement. Dong is a principal architect at Getjar Inc., a mobile application distribution and advertising network.