Stern's comments come as a debate rages around the proliferation of Open Source Initiative-approved licenses. The OSI is working to stem the tide of new open-source licenses by refusing to approve ones that essentially duplicate those that have come before.
At the same time, the OSI is trying to bring sanity to the multitude of older open-source licenses we're already stuck with by reclassifying them as "preferred," "ordinary" (aka approved) or "deprecated."
There was also no security through obscurity, Stern said during his keynote, adding that revealing the source code did not make a product more insecure. "In fact, having more eyes looking at it often makes it more secure, but open source software does not alleviate the need for good architecture," he said.
Stern also discussed the five most popular open-source business models. These ranged from content subscription, where users wanted certain content and the vendor provided it; to stack integration, where people were willing to pay someone else to assemble the stack for them; to the deployment support service model, which delivered updates, patches and the like to customers automatically and seamlessly.
The software-hosted-as-a-service model was on a rapid growth trajectory, like Google Search, where integration and service level agreements were what brought in the revenue; while the fifth model, embedding derivative works, was where value was added on top of the stack, rolled up and offered to customers, he said.
"Today's teenagers have always used software as a service; it is available to them over the Internet and they just consume it. This is the generation that will be buying software in the next 10 years," Stern said.
On the data front, the trend was toward more metadata than data, he said, pointing to companies like Google, which with Google Maps takes a couple of streams of data and transforms them into another set of data that is useful and that people want.
"It drives us though, in this highly assembled world, to think more about security," Stern said, adding that one way to deal with this was through virtualization, but that the biggest challenge was finding ways to automate while driving up the level of virtualization.
As first reported in eWEEK, Red Hat plans to work with the Xen community and others on virtualization to ensure maximum system utilization and availability by optimizing the core operating system platform for virtualized environments. It is also heading the drive to get this technology into the Linux kernel.
"The bottom line is you have to think about what if that constraint you have always assumed to be true was not true any longer. What if the cost of your software goes to zero? What if the cost of administration goes up more than you expect? As we break existing constraints we have to balance this on the risk/reward ratio," he said.
Developments in the social sphere were also affecting technology: consumers were now creators as well, and those who used software were increasingly becoming contributors and participating in the community.
"It's going to be an interesting few years, and a tremendous amount of opportunity is being created, not only for existing companies, but also for the establishment of new business models," Stern said in conclusion.