Big Data & Brews Video Explains How Pivotal’s Hadoop Distribution Is Different

In a video interview for the Big Data & Brews series, Pivotal’s Chief Scientist Milind Bhandarkar shares a beer with Datameer’s CEO Stefan Groschupf and walks through the many features that differentiate Pivotal’s Hadoop distribution from the rest, including HAWQ, GemFire XD, and Spring XD. He also speaks of the future possibility of adding GraphLab, Open MPI, and Apache Spark.

Partner 101: How To Do Business With Cloud Foundry

In the first of a two-part blog post, Pivotal partner manager Nima Badiey answers key questions prospective partners may have when exploring Cloud Foundry and its ecosystem of commercial and open source offerings. He walks through how Cloud Foundry provides a platform for ISVs, solution providers, and application services to integrate with the CF PaaS, how partners can participate in the Cloud Foundry ecosystem, and how to work with Pivotal on our enterprise distribution of Cloud Foundry, Pivotal CF.

PaaS Judo: With Cloud Foundry Ippon Deploys in 2 Minutes Versus 2 Weeks

Ippon Hosting CTO Ghislain Seguy recently shared a profound point about Cloud Foundry: “What used to take 2 weeks now takes 2 minutes on PaaS. In addition, Cloud Foundry is the leading choice for open source PaaS from my perspective. Before, our development environments often took 2 weeks to set up, and an integration or test environment would take even longer. Now, our development environments take 2 minutes, and my infrastructure team is freed up to work on more strategic efforts.” This week at Devoxx in Paris, France, the Ippon team will unveil their new developer PaaS offering and share their story of trying out open source Cloud Foundry and building out a new PaaS service using Pivotal CF in just a few weeks.

Pivotal Debuts at ApacheCon North America 2014: Thanks For Having Us!

Pivotal’s Roman Shaposhnik reviews ApacheCon North America 2014, which took place last week in Denver. At Pivotal’s self-described “coming out party” to the Apache Software Foundation, we worked to make an impression by starting off with a keynote, presenting and attending various sessions, and even hosting a cocktail party. The amount of positive feedback and press coverage proved once again that being transparent and forthcoming is the key to winning trust in open source communities. In this review of the event, Shaposhnik also points community members to some of the newer technologies he believes are hot to watch and use right now.

Time Series Analysis Part 3: Resampling and Interpolation

The previous blog posts in this series introduced how Window Functions can be used for many types of ordered data analysis. Time series data can be found in many real-world applications, including clickstream processing, financial analysis, and sensor data. This post shows how those techniques can be extended to handle time series resampling and interpolation.
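
Purely as an illustrative sketch of the resampling-and-interpolation idea the post covers (the readings, timestamps, and variable names below are invented for the example, not taken from the post), here is what the same operation can look like in Python with pandas:

```python
import pandas as pd

# Hypothetical, irregularly sampled sensor readings (values and timestamps are invented).
readings = pd.Series(
    [20.1, 20.9, 22.4, 21.7],
    index=pd.to_datetime([
        "2014-04-01 00:00:13",
        "2014-04-01 00:02:47",
        "2014-04-01 00:06:05",
        "2014-04-01 00:09:58",
    ]),
)

# Resample onto a regular 1-minute grid; minutes with no observation become NaN.
regular = readings.resample("1min").mean()

# Fill the gaps by interpolating linearly, weighted by the time between observations.
filled = regular.interpolate(method="time")

print(filled)
```

In SQL, the analogous approach is typically to join the observations against a generated regular time grid and use window functions (for example, LAG and LEAD over neighboring rows) to compute the interpolated values.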

The Industry and Press Respond to Pivotal’s Bold Big Data Suite Move

Pivotal’s bold move last week to offer its Big Data Suite as a subscription service received plenty of attention from major industry players and the press. The Pivotal Big Data Suite enables enterprises to realize the business data lake by charging not for storage but for analysis, priced per processing core.

EMC Q&A Part 2: How Spring and tc Server Reduce Costs and Improve Productivity

Jim Nuzzo, Enterprise Architect, EMC

How did EMC reduce costs and increase productivity with Spring? This article is part two of our Q&A session with EMC Enterprise Architect, Jim Nuzzo. The article provides several real-world examples of how Spring began to drive lower costs and increase productivity at EMC. From both an executive and developer standpoint, Nuzzo also explains why and how Spring accomplishes these results. Lastly, he explains the approach and outcomes of a large enterprise integration project based on Spring.

Pivotal’s New Big Data Suite Redefines the Economics of Big Data, Bringing UNLIMITED Hadoop to Enterprises

Today, Pivotal is changing the economics of Big Data forever with the launch of the Big Data Suite. Aimed squarely at redefining those economics through a customer lens and delivering flexible big data computing to drive rapid adoption in the enterprise, the new offering is an annual subscription for software, support, and maintenance that bundles Pivotal Greenplum Database, Pivotal GemFire, Pivotal SQLFire, Pivotal GemFire XD, and Pivotal HAWQ into a flexible pool of big and fast data products. The suite is priced per core and, notably, is delivered with unlimited use of supported Pivotal HD, Pivotal’s commercial distribution of Hadoop. Aggressively priced, the Big Data Suite offers simple usage-based subscription terms along with the variety of proven data services enterprises need to take full advantage of all their data, both present and future.

This Month In Data Science

With GigaOm’s Structure Data Conference, the Pivotal HD 2.0 release announcement, and big cloud platform announcements from Google and Amazon, March was an eventful month for data science and the platforms on which it is practiced. Here are our top picks for the data science news items of the month, from Pivotal and across the entire field.

The Green PaaS: Innovative Data Centers Will Save the World Billions

Digital information worldwide represents 10% of the world’s energy consumption, and the costs are only increasing as our dependence on and use of data accelerates. Fjord IT, a European solutions provider dedicated to green computing, has found that in addition to green energy practices, PaaS is a key ingredient for a greener future. Their work with a large media company using Pivotal CF, powered by Cloud Foundry, reduces energy costs through a more consolidated data center and also has the potential to improve operational efficiency by up to 90% and cut development time by up to 47%.