Apple often plays an integral role in getting core technologies it believes in standardized across the industry. Recently, we discussed Apple’s growing role in WebKit development, and how its work on establishing HTML5 standards also serves to make it that much harder for Flash to become a de facto standard across browsers and platforms.
Elaborating further on the matter, Gizmodo recently explored the reasons why tech standards are so important to Apple, and why it has burned so many calories in its efforts to standardize technologies such as OpenCL, FireWire, and Mini DisplayPort.
When you’re the little guy—as Apple was, and still is in computer OS marketshare, with under 10 percent—having a hand in larger industry standards is important. It keeps your platform and programming goals from getting steamrolled by, say, the de facto “standards” enforced by the bigger guy who grips 90 percent of the market.
If you succeed in creating a standard, you’re making everybody else do things the way you want them done. If you doubt how important standards are, look no further than the old Sony, throwing a new one at the wall every week hoping it would stick. Or Microsoft getting basically everybody but iTunes to use its PlaysForSure DRM a couple of years ago. Or its alternative codecs and formats for basically every genuine industry standard out there. To be sure, there is money to be made in standards, but only if the standard is adopted—and royalties can be collected.
Can you imagine if .WMV and .WMA files were tech standards today? Yikes.
You can check out Gizmodo’s article in its entirety over here.