The Internet of Things (IoT) is going to be big. According to Microsoft chief executive Satya Nadella, IoT will generate 44 zettabytes of data in the coming years. That's a lot of 1s and 0s.
Nadella made the claim during his keynote speech at Convergence 2015 in Atlanta, as part of Microsoft's push around IoT on its Azure cloud platform. This is a collection of cloud-powered tools designed for companies looking to build software that exploits data collected from a myriad of sources.
It is data that is underpinning Microsoft's approach to IoT, with Nadella stating it will drive the direction of the company. "Devices will come and go," he said. "The most interesting thing is the data that's being collected."
Nadella said Microsoft's overriding objective is to find ways to extract value from the data harvested from IoT devices.
The majority of the products announced during Convergence 2015 had data use at their core. Nadella explained how Microsoft wants to empower people and businesses to use the insightful information that can be gleaned from diverse datasets.
Nadella even went so far as to showcase his own use of data, revealing to thousands of onlookers his fitness and wellness data recorded by the Microsoft Band he wears.
V3 was interested to see that the chief executive of one of the largest companies in the world still manages to find time to exercise and get over seven hours' sleep.
Nadella also introduced Seattle Seahawks star quarterback Russell Wilson on stage to discuss how data can be used to monitor players' performances and vital signs.
Given Microsoft's investment into what it calls a 'hyper-scale' multi-purpose cloud, it likely has the capability to handle 44 zettabytes of IoT data. But the Redmond company will be entering an area that is rapidly filling up with other technology giants, such as Intel, ARM and IBM.
10 May 2013
President Obama recently signed an executive order that will require government organisations to release their data to the public in an easily digestible form. The White House says that the move will give entrepreneurs and innovators the information they need to create engaging new products.
To see what sort of potential this government data holds for the private sector, look at the past. Prior to the 1980s, GPS was mostly restricted to military and government use. It wasn't until 1983, when GPS data was released to the public, that consumer mapping technology really began to take shape.
At the time, then-President Ronald Reagan ordered that GPS data become freely available to the public. Reagan's decree followed the downing of Korean Air flight 007, which was shot down after straying off course into Soviet airspace.
By 1989, US company Magellan released the first commercially available portable GPS system. The Magellan NAV 1000 used GPS data from government satellites to put GPS right next to a citizen's Walkman.
The release of the device, and the government data it used, is why we have GPS navigation today. From Google Maps to Apple Maps, the world's turn-by-turn navigation wouldn't be possible without government data.
The example goes to show that the government has types of data that would be almost impossible to get without an open initiative to release it. The US government has the resources to do things that the private sector cannot.
Government agencies have the ability and the remit to collect massive amounts of data on things that private firms would never spend money on. If it weren't for the military's work on GPS, the private sector could be years behind what today's mapping apps are capable of.
A startup would never be able to map the globe or launch a satellite for the sake of a navigation app. By giving out government data, the Obama administration has opened the door for clever entrepreneurs to use data without doing the legwork.
Oracle chief executive Larry Ellison once called cloud computing a fad. Well, that fad is now making Oracle money.
According to an IDC study, Oracle's software revenues grew over three percent on the back of big data and cloud software, giving the firm a 21 percent share of the application development and deployment (AD&D) software market.
Ellison unveiled Oracle's first IaaS offering last October at OpenWorld. Now, the firm is seeing growth in the software market. Oracle's slow march to cloud software looks to be paying off this year.
However, Oracle still has a while to go before it can truly compete in the evolving world of enterprise software. The firm still lags behind IBM and Microsoft in overall enterprise sales.
Microsoft has been the top dog in the enterprise software world for a long time. Its grasp on the market doesn't seem to be slowing anytime soon either with a 17 percent share of the sector.
However, Oracle can still look to get into the number two spot by leaping over IBM. Big Blue didn't see much growth in enterprise software revenues for 2012, registering a little under one percent growth year-over-year.
That, compared with Oracle's growth, could mean big things for the firm's future. If Oracle continues its growth trend, and IBM continues to stay flat, it can take the number two spot.
To do that, Oracle would need to continue bringing out cloud software offerings. The world of enterprise is increasingly becoming cloud-centric. IDC reported that the cloud would be a major growth sector moving towards 2015, and that prediction appears to be coming true.
For Oracle to capitalise on that growth it would need to continue its push towards the ether. Oracle's executives have been hesitant to embrace the cloud on the level of some of its competitors, but it's getting there.
It will be interesting to see if Oracle will attempt to bring out ground-breaking cloud software offerings in the future or if it will stand content to just play catch up with its competitors.
The firm's Q3 earnings were less than impressive. Commentators suggested that a probable cause of the poor earnings was the firm's failure to offer compelling cloud services.
The products the firm released over the second half of last year were a good start. However, to truly capitalise on the cloud market it will need to stop playing catch-up and start being an innovator.
11 Apr 2013
Google just announced its plans to enlist big data in the fight against human trafficking. The search giant will work with three advocacy groups to collect and analyse data from human trafficking hotlines.
The work is aimed at stifling human trafficking by creating a shared data platform for anti-trafficking groups. By using big data, advocacy groups can identify trafficking hotspots and create stronger strategies to put an end to trafficking.
Google's work in the field is an illuminating reminder of the types of projects big data can take on. Big data doesn't have to be used just to create the perfect targeted ad or discover the biggest IT bottleneck.
Big data can also be used to solve a variety of the world's ills. The potential big data holds for the greater good shouldn't be underestimated. From being able to project future crime sprees to solving big city traffic jams, big data holds the key to fighting a variety of societal troubles.
That is one of the reasons why the lack of qualified big data analysts is so troubling. We can have all the data in the world but if we don't have qualified analysts it won't mean anything.
Knowledgeable and creative big data scientists will be crucial if the industry ever hopes to create some sort of major social change. The world will need scientists who not only know what they are doing technically, but also have the creativity needed to use data in unique ways.
Last year, Oracle president Mark Hurd made the comment that most big data is "worthless". According to Hurd, 99.9 percent of big data is unusable.
His assessment may hold weight in the sense that most data will not help a business improve its infrastructure. However, the idea that most big data is useless in the greater context of society is off base.
To truly use data to uncover societal truths we need imaginative analysts, who can take seemingly benign data and transform it into real-world solutions.
By now it's become a cliché to say the world needs more Steve Jobs, but it's the truth. Steve Jobs (and the many pioneers of the computing age) took the technology of their time and brought a sense of creative thinking to it.
We need a generation of Steve Jobs. The technology exists to such a point that creative thinking can change the world. Tech like big data can be used to revolutionise how we think about the world's problems.
With creativity and know-how a data analyst can do amazing things. Not just in business, but also for society as a whole.
Now, it's up to clever people to take up an interest in the field. To do that people will need equal parts ingenuity and opportunity. They'll need the opportunity to learn and discover the power of the trade. They'll also need to understand big data is more than just statistics.
Facebook doesn't make money on hardware, software, or subscriptions. Instead, it makes money on the data users put out. It takes that data and sells it to advertisers, who in turn sell users stuff through targeted ads.
The idea that major corporations sell users' data scares a lot of people. These people don't necessarily have anything to hide; they're just ordinary people who like to have a sense of privacy.
These people use Gmail, Facebook, and Google+. Some of them will even probably end up using Facebook Home.
These potential Facebook Home users spoke up about their fears that the app/skin/thing would invade their privacy in a way unheard of previously. So Facebook went on the offensive and dropped a Q&A for Home's privacy policies.
The Q&A basically said Facebook Home doesn't change the way the company handles user data. Users' location data won't be collected in any way that is unique, and Home won't collect data users create in other apps.
So if nothing changes, then what is the end game? Why is Facebook making a free super-app that doesn't do anything new for advertisers? Because by putting itself on your home screen, Facebook can glean a lot more data using the same policies.
By buying into Facebook Home users will be sort-of using a Facebook ecosystem. Facebook already has an app store which has the potential for growth. It also has a messaging service and a slew of other apps users could use to replace their current Android offerings.
The famous Microsoft "Scroogled" campaign derided Google for searching through Gmail messages to serve up sponsored ads. Google uses all of its apps to give advertisers some new kinds of data.
Now Facebook is doing the same thing as its semi-rival Google. It's building out an ecosystem in an attempt to better understand how to sell its users stuff. So if you are the type to worry about Facebook Home's privacy policies, you should be less focused on Home and more focused on Facebook as a whole.
Facebook's current privacy policies are the real issue, not the future violations of an unreleased app. If anything is to be done, it should be getting Facebook to update its current policies to better adapt to mobile.
The company has already defined itself as a mobile company, so perhaps it should make privacy policies that reflect that. If Facebook really wants to talk up its privacy agenda, it needs to work on changing its current policies, not just talk about what it's doing with a new app.
03 Apr 2013
When it comes to data, it may be easy to assume that size really does matter. There's nary a tech company going that hasn't attempted to jump on the big data bandwagon, with the common consensus being that in the search for truth, the more data points the better.
It's not an argument that convinces Kate Crawford, a principal researcher at Microsoft Research. As she points out in a newly published piece for the Harvard Business Review, massive data sets and predictive analytics do not always reflect objective truths. Sometimes big data is victim of bias.
Crawford points to the example of a study into Hurricane Sandy, which lashed the East Coast of the US in late October 2012.
In it, researchers examined more than 20 million tweets, combining them with data from Foursquare, to uncover a detailed picture of how people responded to the terrifying events. And in short, they stocked up on food before the storm hit, and went out partying after it was over. Or so the data tells us.
The problem is, most of the data came from Manhattan, where, it might be assumed, use of Twitter was higher than average.
“Very few messages originated from more severely affected locations, such as Breezy Point, Coney Island and Rockaway,” notes Crawford.
Other big data studies have illustrated similar problems, such as Google's massive over-estimation of the number of US flu cases this winter. Google Flu Trends predicted 11 percent of the population would contract the illness, while the Centers for Disease Control and Prevention estimated the figure to be just six percent.
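The sampling problem Crawford describes is easy to demonstrate. Here is a toy sketch in Python, with entirely invented numbers rather than anything from the actual study: if the hardest-hit district tweets the least, an estimate built from tweets alone will understate how many people were affected.

```python
# Toy illustration of sampling bias (all numbers invented).
# Two groups: a large, lightly affected, tweet-happy district and a
# smaller, badly affected district that was mostly offline in the storm.
population = (
    [{"affected": False, "tweets": True}] * 800   # lightly hit, online
    + [{"affected": True, "tweets": False}] * 150  # hard hit, offline
    + [{"affected": True, "tweets": True}] * 50    # hard hit, online
)

# Ground truth across the whole population.
true_rate = sum(p["affected"] for p in population) / len(population)

# What a tweet-only study would see: just the people who tweeted.
tweeters = [p for p in population if p["tweets"]]
observed_rate = sum(p["affected"] for p in tweeters) / len(tweeters)

print(f"true share affected:  {true_rate:.0%}")      # 20%
print(f"tweet-based estimate: {observed_rate:.0%}")  # 6%
```

The tweet-based estimate misses most of the affected residents for exactly the reason Crawford gives: the places hit hardest, the Breezy Points and Rockaways, barely show up in the sample at all.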
But while there may be many who would welcome less breathless excitement about so-called big data, Crawford's thinking doesn't spell the end of tooth-grindingly awful buzz phrases.
“We get a much richer sense of the world when we ask people the 'why' and the 'how', not just the 'how many',” she wrote.
“We can move from the focus on merely "big" data towards something more three-dimensional: data with depth.”
Deep data. Whatever next?
Earlier this week, Canonical and MapR teamed for an announcement that could signal a change in the way we see big data platforms.
The pair said that the latest versions of Ubuntu would be bringing support for MapR's Hadoop database management and development platform. Now, Ubuntu users will be able to access data from Hadoop deployments.
The move also signals that interest in big data is on the rise. Once the preserve of a select few search giants and academic institutions, big data systems can now be accessed by developers and administrators with little more than a Linux-equipped notebook.
This could prove to be a key point in the expansion of the big data market into the larger enterprise space. While hardware vendors want to push the systems in order to sell their latest and most powerful products, the basic tools for operating these huge clusters need to be accessible to those who will be controlling and analysing big data platforms.
The move will also be a great help to the education space, where qualified data analysts are in short supply. With many university programmes still in the early phases of developing any sort of big data analysis training programmes, having tools available on existing platforms and familiar systems such as Ubuntu could prove critical.
27 Mar 2013
Oracle continues to struggle to sell servers. Earlier this month the firm reported that third-quarter server revenue was down 23 percent year-over-year. Overall, the company failed to keep pace with its competitors, struggling to hold a meagre four percent market share.
The Larry Ellison-led company has failed to find the secret sauce necessary to interest firms in its line of hardware. Oracle's inability to compete with rivals has really hurt the company's bottom line over the past few years.
While the company recently launched a new line of Sparc T5 and M5 servers, it remains to be seen whether that will be enough. Even if the servers turn out to be amazing, Oracle's real problem is its strategy of designing entire systems around Oracle-only gear.
The firm runs on the idea that by doing it all itself it can create the best systems, building out platforms that require all-Oracle software and hardware.
Unfortunately for Oracle, the hardware industry no longer works without some form of co-operation.
Take HP, for example: its Pathfinder programme sees the firm working with other tech firms to create ARM servers for its Project Moonshot initiative.
The programme sees HP co-developing servers with hardware and software vendors. HP's approach is different from Oracle's in that it aims to build servers that are not so closed and proprietary.
Oracle still operates on an old way of thinking. The firm believes that a business can get away with offering a proprietary system, but in today's infrastructure that is no longer true. Users want choice, and not to be locked into a single option.
HP will need to focus on open platforms if it wants to turn things around, and under Meg Whitman it looks like it's going that way. Project Moonshot is a good example of the new paradigm HP is creating. Over time, that shift could mean big things for the firm's ability to take some of Oracle's business.
Things at HP and Oracle are both quite bumpy at the moment. But one firm is making the smart move - at least when it comes to servers. HP sees a future more in line with what smaller firms like Salesforce are doing. Oracle, however, is struggling to adapt.
The idea that Oracle doesn't "get it" isn't necessarily anything new. Salesforce chief executive and Larry Ellison's mortal enemy Marc Benioff said something similar back in 2011. But Ellison and Oracle still don't get it.
HP is adapting, even IBM is adapting, but Oracle just doesn't get it.