Multicore And The Risk Of Being Left Behind - Catalyst IT
NZ companies risk losing competitive advantage by not understanding multicore computing
Press Release - 24 December 2012
Multicore World 2013
Government and private companies, as well as the wider IT community, need to understand how multicore computing and parallel processing technology works, says Catalyst IT director Don Christie.
“The risk of not understanding the current and ever growing power of multicore computers and the massive increase in power and performance they enable is simply the risk of being left behind,” says the founder of New Zealand’s largest open source IT company.
“An event such as this, with such important subject matter and quality people is really impressive. That’s a major reason we’ve got behind supporting it.”
Christie says the architecture and programming required to fully utilise multicore computing is part of today’s and all of tomorrow’s IT world.
The next Twitter or Facebook, which could easily originate in New Zealand, will be built on a multicore platform. Multicore, and the programming that takes advantage of it, hugely increases the ability to handle massive data streams and is today’s reality.
IT professionals, entrepreneurs and those looking at where the next big thing will spring from should attend Multicore World 2013 at the Wellington Town Hall on February 19 and 20, says Christie.
The introduction of ultra-fast broadband as one way to tap into multicore computing’s power, and New Zealand’s involvement in the Square Kilometre Array project, which will use linked radio telescopes to ‘view’ the stars and universe, are just some of the topics up for discussion at the event.
“It is difficult to know where opportunities from multicore and parallel programming will pop up,” says Christie.
“That makes it even more important to attend, learn, discuss, connect and figure where the openings are for your own company or organisation.
“If people ignore the opportunity to attend such a world-class conference in their own backyard, they’ll miss being at the edge of new developments, and miss being at the heart of today’s and tomorrow’s computing architecture.”
What is multicore?
The ability of computers to process massive amounts of data has been growing ever since they were invented. But as computing power has increased, processor speeds have hit a physical barrier: more processing power cannot be squeezed out of a single core without the chip overheating.
The problem has been solved by putting more processors onto a single chip, creating multicore chips. These multicore chips entered the mainstream market a few years ago, and all vendors currently sell them. They are now standard kit in all laptops, desktops and smartphones.
Multicore chips are also more power efficient, and in principle there is virtually no limit to the number of cores that can be added.
Previously impossible computational tasks can now be achieved, and processes which previously took days or even weeks can now be done swiftly.
But while this new processing power enables computers to do things faster, it also adds new challenges.
Before multicore, computer software was written for a single central processing unit (CPU) on a chip. To exploit the potential of multicore chips, software now needs to be written with parallelism in mind.
But parallel programming is different from traditional programming, and so far few programmers have experience of it.
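The shift described above can be illustrated with a minimal sketch (our own, not from the press release): the same computation written the traditional way, one job after another on a single core, and then spread across cores using Python’s standard-library multiprocessing module. The `work` function here is a hypothetical stand-in for any CPU-heavy task.

```python
from multiprocessing import Pool

def work(n):
    """A stand-in for any CPU-heavy task, e.g. processing one chunk of data."""
    return sum(i * i for i in range(n))

def sequential(jobs):
    # Traditional style: one core handles every job in turn.
    return [work(n) for n in jobs]

def parallel(jobs, cores=4):
    # Multicore style: a pool of worker processes shares the jobs out
    # across the available cores.
    with Pool(processes=cores) as pool:
        return pool.map(work, jobs)

if __name__ == "__main__":
    jobs = [100_000] * 8
    assert sequential(jobs) == parallel(jobs)  # same answers, computed in parallel
```

The results are identical; only the execution changes. The catch, as the article notes, is that real programs are rarely this cleanly divisible — once jobs share state or depend on each other’s results, the programmer must think in parallel from the start.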
Multicore hardware is mainstream, but the skills to program it are (as yet) niche.
In the next 10-15 years, there will be huge opportunities to translate ‘traditional’ sequential legacy code, and to create new software that takes full advantage of the thousands of cores in the next generation of chips.
Around the world, parallel computing is currently used to process vast quantities of data produced by the internet and the "big data" originating from social networks and the millions of intelligent data-recording devices attached to the internet.
Here in NZ, it is also used at Wellington's Weta Digital, the biggest CGI rendering facility in the world.
And soon it will be a key component of the information processing required to handle the data produced by the Square Kilometre Array radio telescope – a global scientific project that New Zealand is part of.
In addition, there is a wide range of services, solutions and systems integration challenges in connecting the two worlds together.