Computing has achieved a great deal. Compare today's computer systems with those of ten or twenty years ago, and much has clearly changed and improved. These advances in computer programming have enabled technologies to reach greater heights – advancements we could only dream of during computing's infancy.
However, this evolution has its pros and its cons. Computing systems are far more efficient now than earlier models: you can do multiple tasks at once, and the Internet has become an essential aspect of life. Unfortunately, advances in technology also mean that robots can now do manual labor that was once done by people. People can lose their jobs and struggle because of limited opportunities – especially workers who lack formal education.
Moreover, technology now has access to more data than ever, yet we have little idea how these data are used, transmitted and stored. Hacking has become a bigger threat not only to governments and big companies but to ordinary citizens as well.
“A dream of artificial intelligence is to build systems that can write computer programs.”
Coding has been described as one of the most important skills of the future, and a recent survey from job-market analytics firm Burning Glass found that as many as seven million job openings in 2015 required some form of coding skills.
But with AI now having the ability to code itself, it could put many budding coders out of work.
A recent report from the United Nations (UN) revealed AI is set to displace millions of workers across the globe as scientists race towards building machines with human-level intelligence.
While many firms will welcome the prospect of labour that is cheaper and more efficient than humans, it will leave many people worried about their economic future.
And there is no stopping technology from achieving what once seemed impossible and changing society as we know it.
Scientists have trained a quantum computer to recognize trees. That may not seem like a big deal, but the result means that researchers are a step closer to using such computers for complicated machine learning problems like pattern recognition and computer vision.
The team used a D-Wave 2X computer, an advanced model from the Burnaby, Canada–based company that unveiled what it billed as the world's first commercial quantum computer in 2007. Conventional computers can already use sophisticated algorithms to recognize patterns in images, but doing so takes a lot of memory and processor power. This is because classical computers store information in binary bits – either a 0 or a 1. Quantum computers, in contrast, operate at a subatomic level using quantum bits (or qubits) that can represent a 0 and a 1 at the same time. A processor using qubits could theoretically solve a small set of specialized problems exponentially faster than a traditional computer. The nature of quantum computing and the difficulty of programming qubits have meant that complex problems like computer vision have been off-limits until now.
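To make the bits-versus-qubits contrast concrete, here is a small illustrative sketch (run on an ordinary computer, so it gains no quantum speedup; the numbers are my own, not from the study). A classical n-bit register holds exactly one of 2^n values at a time, while the state of n qubits is described by 2^n amplitudes that all contribute at once:

```python
import itertools
import math

n = 3
classical_register = (1, 0, 1)  # a classical register: one definite value out of 2**n

# An equal superposition over all 2**n basis states: each basis state
# gets amplitude 1/sqrt(2**n), so every combination is represented at once.
amp = 1 / math.sqrt(2 ** n)
quantum_state = {bits: amp for bits in itertools.product((0, 1), repeat=n)}

print(len(quantum_state))  # 8 basis states tracked simultaneously for n = 3
# Squared amplitudes are probabilities, so they must sum to 1:
print(round(sum(a * a for a in quantum_state.values()), 6))  # 1.0
```

Simulating those 2^n amplitudes classically is exactly what becomes infeasible as n grows, which is why quantum hardware is interesting for such problems in the first place.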
In the new study, physicist Edward Boyda of St. Mary’s College of California in Moraga and colleagues fed hundreds of NASA satellite images of California into the D-Wave 2X processor, which contains 1152 qubits. The researchers asked the computer to consider dozens of features—hue, saturation, even light reflectance—to determine whether clumps of pixels were trees as opposed to roads, buildings, or rivers. They then told the computer whether its classifications were right or wrong so that the computer could learn from its mistakes, tweaking the formula it uses to determine whether something is a tree.
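The training loop the researchers describe – score features, compare against the right answer, tweak the formula on mistakes – is the classic supervised-learning pattern. A minimal classical sketch of that workflow, using a simple perceptron and made-up feature values (this is an analogy, not the D-Wave algorithm from the study):

```python
# Each pixel clump is a feature vector (hue, saturation, reflectance).
# The model predicts "tree" (1) or "not tree" (0) and nudges its weights
# whenever it is told a classification was wrong.

def predict(weights, bias, features):
    score = sum(w * f for w, f in zip(weights, features)) + bias
    return 1 if score > 0 else 0  # 1 = tree, 0 = road/building/river

def train(samples, labels, epochs=20, lr=0.1):
    weights = [0.0] * len(samples[0])
    bias = 0.0
    for _ in range(epochs):
        for features, label in zip(samples, labels):
            error = label - predict(weights, bias, features)
            if error:  # learn from the mistake by adjusting the formula
                weights = [w + lr * error * f for w, f in zip(weights, features)]
                bias += lr * error
    return weights, bias

# Toy data: (hue, saturation, reflectance) per pixel clump -- illustrative only.
samples = [(0.30, 0.80, 0.20), (0.33, 0.75, 0.25),   # tree-like clumps
           (0.05, 0.10, 0.90), (0.60, 0.15, 0.85)]   # road/building-like clumps
labels = [1, 1, 0, 0]

w, b = train(samples, labels)
print([predict(w, b, s) for s in samples])  # [1, 1, 0, 0]
```

The study's contribution is running this kind of feature-weighting classification on quantum hardware, where the 1152 qubits explore candidate formulas in a fundamentally different way than the sequential updates above.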
All these technological breakthroughs can overwhelm an ordinary citizen who understands only basic computer programming and terminology. And even though common sense dictates that hacking is wrong, a working knowledge of it may eventually prove beneficial after all.
As the world around us becomes more connected to the internet, the ways hackers can infiltrate our lives keep multiplying. Today data breaches are taking place in ways that were unheard of just a decade ago — from remotely hacking cars to infiltrating “smart” teddy bears.
The threats have grown so quickly that companies are overwhelmed by the increasing number of attacks, security experts say. This is not just because of the growing number of opportunities to infiltrate a network or device, but also because these attacks are increasingly automated and launched from low-priced computer hardware using open-source tools that require relatively low coding skills to deploy. Defending against such attacks can require well-paid and highly trained experts.
“We believe that cybersecurity is a correctable math problem that, at present, overwhelmingly favors the attackers,” Ryan M. Gillis, vice president of cybersecurity strategy for enterprise security company Palo Alto Networks, said at a House Homeland Security Committee meeting last week about protecting the private sector from hacking. “Network defenders are simply losing the economics of the cybersecurity challenge.”
Technology is growing and progressing rapidly, and there is no stopping it. It makes perfect sense to embrace the world of computer programming, considering how much of what we do revolves around computers and technology. Having at least a basic understanding of it can save you from a lot of headaches that have to do with technology – whether at home or at work.