In a job market absolutely stuffed with computer programming positions, few in the general population care, or should, about the complaints of pampered coders. “Oh, is PHP disorganized and difficult to debug? Boo-hoo, you’ll just have to bill a few more overpriced hours, I guess.”
But that sort of schadenfreude is self-defeating, since coders create the tools we use to interact with our world. Even minor frustrations for them can trickle down to major frustrations for the end user. One of the biggest such shared frustrations is the phenomenon of “code rot,” in which quickly advancing standards in hardware and foundational software lead to more and more conflicts and inefficiencies in existing programs. Code rot is why programs seem to run worse and worse over time: because, in reality, they do.
Now, Adobe and MIT are teaming up to try to address the problem of code rot, and remove from the life of the coder the arduous and time-consuming task of manually updating old code for new technological capabilities. The project is called Helium, and its goal is to build software solutions capable of taking old code and optimizing it for new CPU task-sharing technology, newly efficient GPU architecture, new hardware-level security features, and more. It could save coders months of hard work on just a single project and, more importantly, it could save software users quite a bit of time, money, and frustration.
So far, Helium has produced a single proof-of-concept study centered on Photoshop image filters. Essentially, the researchers recorded all the commands Photoshop issued to the CPU when applying a certain filter, and compared them to the actual on-screen changes those commands produce. That comparison can give a software “auto-tuner” the information it needs to see which commands are superfluous, and which could be made quicker by making use of new hardware capabilities.
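To make that idea concrete, here is a toy sketch in C++ of the general notion of replaying a recorded command trace with one command dropped and checking whether the visible result changes. Everything in it, from the Command struct to the operation names, is a made-up illustration of the concept, not Helium’s actual machinery, which analyzes real Photoshop execution traces rather than hand-written lambdas.

#include <cstdint>
#include <cstdio>
#include <functional>
#include <string>
#include <vector>

// Toy "image": a flat vector of pixel values.
using Image = std::vector<int>;

// A recorded command: a name plus its effect on the image.
struct Command {
    std::string name;
    std::function<void(Image&)> apply;
};

// Replay the trace against a copy of the input, optionally skipping one command.
Image replay(const Image& input, const std::vector<Command>& trace,
             size_t skip = SIZE_MAX) {
    Image img = input;
    for (size_t i = 0; i < trace.size(); ++i) {
        if (i == skip) continue;
        trace[i].apply(img);
    }
    return img;
}

int main() {
    Image input(16, 100);  // 16 pixels, all set to 100

    // A hypothetical trace captured while a filter ran.
    std::vector<Command> trace = {
        {"brighten",    [](Image& im) { for (int& p : im) p += 20; }},
        {"bookkeeping", [](Image& im) { (void)im; /* touches no pixels */ }},
        {"threshold",   [](Image& im) { for (int& p : im) p = (p > 110) ? 255 : 0; }},
    };

    // Reference output: the full trace with nothing skipped.
    const Image reference = replay(input, trace);

    // Drop each command in turn; if the result is unchanged, that command
    // contributed nothing visible and is a candidate for removal.
    for (size_t i = 0; i < trace.size(); ++i) {
        const bool superfluous = (replay(input, trace, i) == reference);
        std::printf("%-12s %s\n", trace[i].name.c_str(),
                    superfluous ? "superfluous" : "needed");
    }
    return 0;
}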
As revealed in their proof-of-concept study published several months ago, the new version of the filter that resulted from these observations ran almost 75% faster, having been rewritten more efficiently in a more modern image-processing language called Halide. The researchers admit that they picked an ideal candidate for this type of optimization. But still, 75% is an impressive achievement.
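For readers unfamiliar with it, Halide’s central trick is separating what a filter computes (the algorithm) from how that computation is mapped onto hardware (the schedule). The snippet below is a small C++ example in the spirit of Halide’s own introductory box-blur tutorial, not the code the researchers actually generated from Photoshop; it simply shows why the language suits this kind of retargeting, since the schedule lines can be retuned for new vector units or core counts without touching the math above them.

#include "Halide.h"
#include <cstdio>
using namespace Halide;

int main() {
    // The algorithm: a 3x3 box blur, described independently of any hardware.
    Var x("x"), y("y"), xi("xi"), yi("yi");
    Func blur_x("blur_x"), blur_y("blur_y");

    Buffer<uint16_t> input(1024, 1024);
    for (int j = 0; j < 1024; j++)
        for (int i = 0; i < 1024; i++)
            input(i, j) = (uint16_t)((i + j) % 256);   // dummy test pattern

    blur_x(x, y) = (input(x, y) + input(x + 1, y) + input(x + 2, y)) / 3;
    blur_y(x, y) = (blur_x(x, y) + blur_x(x, y + 1) + blur_x(x, y + 2)) / 3;

    // The schedule: tile the image, vectorize across pixels, parallelize across rows.
    // This part, and only this part, is what gets retuned for new hardware.
    blur_y.tile(x, y, xi, yi, 256, 32).vectorize(xi, 8).parallel(y);
    blur_x.compute_at(blur_y, x).vectorize(x, 8);

    // Run it (staying clear of the input's right/bottom edges).
    Buffer<uint16_t> output = blur_y.realize({1020, 1020});
    std::printf("output(10, 10) = %d\n", (int)output(10, 10));
    return 0;
}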
Adobe is a logical partner for the Helium project, for the same reason Photoshop was the logical choice for initial test code: Photoshop is the quintessential legacy program, and Adobe’s remarkable staying power has the perverse effect of making it harder for Adobe than for almost anyone else to keep its software efficient. The company invests enormous numbers of worker-hours in keeping its code from going rotten over time; oh, if only that code could keep itself from doing so.
However, this remains primarily an MIT-driven initiative, which means that the fruits of the research will be widely available. Played out over the long term, this sort of approach could lead to code that is sold in a deliberately somewhat unfinished state, to be finished off during the installation process based on a scan of the user’s particular hardware setup.
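To picture what that install-time finishing might look like, here is a hypothetical sketch, again using Halide, in which an installer inspects the machine it is running on and compiles a filter specialized to whatever it finds. The filter, the feature check, and the schedule choices are illustrative assumptions rather than a description of any real installer; the Halide calls themselves (get_host_target, has_feature, compile_to_static_library) are standard parts of the library.

#include "Halide.h"
#include <cstdio>
using namespace Halide;

int main() {
    // "Scan" the machine this copy of the software is being installed on.
    Target target = get_host_target();

    // A stand-in filter (hypothetical): brighten an 8-bit image by 20.
    ImageParam input(UInt(8), 2, "input");
    Var x("x"), y("y");
    Func brighten("brighten");
    brighten(x, y) = cast<uint8_t>(min(cast<int>(input(x, y)) + 20, 255));

    // Finish the schedule according to what the hardware scan found.
    if (target.has_feature(Target::AVX2)) {
        brighten.vectorize(x, 32).parallel(y);   // wide vectors, all cores
    } else {
        brighten.vectorize(x, 8).parallel(y);    // conservative fallback
    }

    // Emit a library compiled specifically for this machine, so the
    // installed binary is "finished off" for the detected hardware.
    brighten.compile_to_static_library("brighten_installed", {input},
                                       "brighten", target);

    std::printf("Specialized for target: %s\n", target.to_string().c_str());
    return 0;
}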
Self-optimizing software could seriously reduce the workload for coders all over the world — but it’s probably still worth doing, all the same.