Magic Quadrant for Business Intelligence and Analytics Platforms

In February, Gartner released its Magic Quadrant for Business Intelligence and Analytics Platforms (ID: G00239854).

The analysts were Kurt Schlegel, Rita L. Sallam, Daniel Yuen and Joao Tapadinhas.

(Screenshot: Magic Quadrant for Business Intelligence and Analytics Platforms, 2013-02-22)

So, what’s new?

The examples of Birst and GoodData show that BI’s future is in the cloud. So far, all cloud solutions have had to live with a hybrid model: deployed either in the cloud or in an on-premises appliance. Birst’s success in winning deals is based on its functional breadth, depth and strength, ease of use, and low-cost-of-ownership value proposition. This makes Birst the “new darling” of the Magic Quadrant.

“Increasingly, Gartner sees more organizations building diagnostic analytics that leverage critical capabilities, such as interactive visualization, to enable users to drill more easily into the data to discover new insights. For example, visual patterns uncovered in the data might expose an inconsistent supply chain process that is the root cause of an organization’s ability to consistently reach its goal for on-time delivery.”

“If there were a single market theme in 2012, it would be that data discovery became a mainstream architecture. For years, data discovery vendors — such as QlikTech, Salient Management Company, Tableau Software and Tibco Spotfire — received more positive feedback than vendors offering OLAP cube and semantic-layer-based architectures.”

Featured companies:
Actuate, Alteryx, Birst, Bitam, Board International, GoodData, IBM, Information Builders, Jaspersoft, LogiXML, Microsoft, MicroStrategy, Oracle, Panorama Software, Pentaho, Prognoz, QlikTech, Salient Management Company, SAP, SAS, Tableau Software, Targit, Tibco Spotfire

Other Vendors to Consider:
1010data, Advizor Solutions, Altosoft, Dimensional Insight, eQ Technologic, InetSoft, JackBe, Jedox, myDials/Adaptive Planning, Phocas, SpagoBI, Strategy Companion, Yellowfin

Read Gartner’s summary here: Magic Quadrant for Business Intelligence and Analytics Platforms


SmartCamp Winner Spotlight: StreetLight Data

StreetLight Data describes how people use their city. In its own words: “What web analytics does for e-commerce, we do for in-store commerce. We provide metrics like: for this corner on a typical Tuesday afternoon, what percent of people coming by are over 40? Make less than $60k/year? How many are going shopping? For the retail ecosystem this means answering questions that have never before been answered: Who is coming by my store and NOT coming in? We also open up insights that can radically improve transportation and urban planning. Our solutions depend on our Route Science(TM) engine, which puts messy transportation data in context, leveraging archival data exhaust from the location-based services industry.”

Read more on IBM SmartCamp: http://ibmsmartcamp.com/2012/09/20/smartcamp-winner-spotlight-streetlight-data/

BellaDati – makes big data directly accessible to business leaders

Currently, big data is a melting pot of distributed data architectures and tools like Hadoop, NoSQL, Hive and R. But new companies are emerging that offer toolsets to make big data accessible to business leaders.

BellaDati is a fit-for-purpose product that abstracts away as much of the technical complexity as possible, so that the power of big data can be put into the hands of business users.
[youtube http://www.youtube.com/watch?v=Kwm0SP_hJQQ]

BellaDati’s idea is to provide the world with a tool to reinvent the way business users interact with their data.

So far, BellaDati offers solutions for several industries.

This trend of simplifying access to data and insights will change the BI landscape as we know it. Martin Trgina, CEO and founder of BellaDati, shares this vision and expresses it in BellaDati’s mission statement:

“We believe everybody should have an answer to data questions without waiting and in a nice design. We believe that everyone can love the BI. So — we made BellaDati.”

Algorithms Are Taking Over The World : Christopher Steiner at TEDxOrangeCoast

Christopher Steiner is the author of Automate This (2012) and $20 Per Gallon, a New York Times Bestseller (2009). He is a cofounder at Aisle50, a Y Combinator company that sells grocery deals through the Web. Before starting Aisle50 in 2011, Steiner was a senior writer covering technology at Forbes magazine for seven years.
His writing has also appeared in The Wall Street Journal, the Chicago Tribune, Fast Company, MIT Technology Review and Skiing Magazine. He holds an engineering degree from the University of Illinois at Urbana-Champaign and a masters in journalism from Northwestern University. Steiner lives in Evanston, Ill., with his family.

About TEDx.
TEDx was created in the spirit of TED’s mission, “ideas worth spreading.” The program is designed to give communities, organizations and individuals the opportunity to stimulate dialogue through TED-like experiences at the local level. At TEDx events, a screening of TEDTalks videos — or a combination of live presenters and TEDTalks videos — sparks deep conversation and connections. TEDx events are fully planned and coordinated independently, on a community-by-community basis.

Werner Vogels (CTO, Amazon) points out three key trends in cloud computing.

During the panel discussion on cloud computing at LeWeb 12 in Paris, Werner Vogels highlighted three key trends:

1. Data Analytics

  • Companies are analyzing data sets to gain a deeper understanding of their customers.
  • What are their customers doing, how are they operating, and how are they using the company’s products?

2. Big Science

  • Science accelerated through computing power: calculations and searches that once took six months can now be done in three hours.

3. Mobile

  • All your data is in the cloud; the device is just a window into it.

The panel’s name: Our Heads Are in the Cloud!
Moderator: Robin Wauters, European Editor, The Next Web
Panelists:
Aditya Agarwal, Vice President of Engineering, Dropbox
Brad Garlinghouse, CEO, YouSendIt
Werner Vogels, CTO, Amazon

The Pragmatic Definition of Big Data by Mike Gualtieri

Mike Gualtieri says: forget about the three Vs.

Big data is not defined by how you can measure data in terms of volume, velocity, and variety. The three Vs are just measures of data: how much, how fast, and how diverse. That is a quaint definition of big data, to be sure, but not an actionable, complete definition for IT and business professionals. A more pragmatic definition of big data must acknowledge that:

  • Exponential data growth makes it increasingly difficult to manage — store, process, and access.
  • Data contains nonobvious information that firms can discover to improve business outcomes.
  • Measures of data are relative; one firm’s big data is another firm’s peanut.

A pragmatic definition of big data must be actionable for both IT and business professionals.

The Definition Of Big Data

Big Data is the frontier of a firm’s ability to store, process, and access (SPA) all the data it needs to operate effectively, make decisions, reduce risks, and serve customers.

To remember the pragmatic definition of big data, think SPA — the three questions of big data:

  • Store. Can you capture and store the data?
  • Process. Can you cleanse, enrich, and analyze the data?
  • Access. Can you retrieve, search, integrate, and visualize the data?
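The three SPA questions can be read as a simple yes/no test. The following Python sketch is purely illustrative (the class and method names are my assumptions, not part of Gualtieri’s definition): a dataset sits at a firm’s big data frontier as soon as any one of the three answers is “no.”

```python
from dataclasses import dataclass

# Illustrative sketch only -- names are assumptions, not from the Forrester post.
@dataclass
class SpaAssessment:
    can_store: bool    # Store: can you capture and store the data?
    can_process: bool  # Process: can you cleanse, enrich, and analyze the data?
    can_access: bool   # Access: can you retrieve, search, integrate, and visualize it?

    def at_big_data_frontier(self) -> bool:
        # "Big data is the frontier of a firm's ability": a firm is at that
        # frontier as soon as any one of the three capabilities falls short.
        return not (self.can_store and self.can_process and self.can_access)

# A firm that can store and access its data but cannot yet analyze it
# is already at its big data frontier.
print(SpaAssessment(can_store=True, can_process=False, can_access=True).at_big_data_frontier())  # True
```

This also captures the “relative” point above: two firms can assess the same dataset and get different answers, because the test is about each firm’s own capabilities, not the data’s size.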

Hear Gualtieri explain this definition on a special episode of Forrester TechnoPolitics: The Pragmatic Definition of Big Data Explained

Read more on: Forrester Blogs

Collaboration is the future, according to Amazon Web Services Chief Data Scientist Matt Wood

Once data makes its way to the cloud, it opens up entirely new methods of collaboration where researchers or even entire industries can access and work together on shared datasets too big to move around. “This sort of data space is something that’s becoming common in fields where there are very large datasets,” Wood said, citing as an example the 1000 Genomes project dataset that AWS houses.

DNAnexus’s cloud-based architecture

The genetics space is drooling over the promise of cloud computing. The 1000 Genomes database is only 200TB, Wood explained, but very few project leads could get the budget to store that much data and make it accessible to their peers, much less the computation power required to process it. And even in fields such as pharmaceuticals, Amazon CTO Werner Vogels told me during an earlier interview, companies are using the cloud to collaborate on certain datasets so companies don’t have to spend time and money reinventing the wheel.

Please continue reading the article on gigaom.com.

Introduction to Hadoop by Bill Graham (@billgraham)

A very nice introduction to Big Data and Hadoop by Bill Graham (@billgraham).


The UC Berkeley School of Information has a great course in which UC Berkeley professors and Twitter engineers lecture on the most cutting-edge algorithms and software tools for data analytics as applied to Twitter microblog data. Topics include applied natural language processing algorithms such as sentiment analysis, large-scale anomaly detection, real-time search, information diffusion and outbreak detection, trend detection in social streams, recommendation algorithms, and advanced frameworks for distributed computing.
Bill Graham (@billgraham), who is active in the Hadoop community and a Pig contributor, gave a very clear and detailed intro to Hadoop and outlined how it is used at Twitter. His slides can be found here.
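To make the MapReduce model behind Hadoop concrete, here is a minimal single-machine word-count sketch in Python. It only illustrates the programming model (the sample data is made up): in a real Hadoop job, the map and reduce functions run distributed across a cluster, and the framework performs the shuffle/sort step between the two phases.

```python
from itertools import groupby
from operator import itemgetter

def mapper(line):
    # Map phase: emit a (word, 1) pair for every word in the line.
    for word in line.lower().split():
        yield word, 1

def reducer(word, counts):
    # Reduce phase: sum all the counts emitted for one word.
    return word, sum(counts)

def word_count(lines):
    # Hadoop shuffles and sorts mapper output by key between the phases;
    # sorting here simulates that step on a single machine.
    pairs = sorted(kv for line in lines for kv in mapper(line))
    return dict(
        reducer(word, (count for _, count in group))
        for word, group in groupby(pairs, key=itemgetter(0))
    )

print(word_count(["big data is big", "hadoop processes big data"]))
# -> {'big': 3, 'data': 2, 'hadoop': 1, 'is': 1, 'processes': 1}
```

The value of the model is that `mapper` and `reducer` are independent per record and per key, so Hadoop can run thousands of copies of each in parallel over data too large for one machine.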

Follow the course on:
UC Berkeley Course Lectures: Analyzing Big Data with Twitter

Creative Data Agency from Germany