Malicious software, also known as malware, has become as ubiquitous as the computing devices we use every day. Malware tends to keep pace with the sophistication of its intended targets, and security researchers are now concerned about a new breed of malware that could attack smart software such as Apple's Siri, Google Now, and Microsoft Cortana.

How Malware Takes Over Your Electronics

When a computer virus infects a system, it may lie dormant until its code instructs it to execute and propagate. This is the most common mechanism that computer users in Apple Valley, Hesperia, and Victorville are protected against when they install antivirus software on their desktops and laptops.

The infection and execution mechanism described above has been used by hackers for decades, long enough for computer security experts to develop protective measures against it. The key is detecting a known footprint, or signature, so that the antivirus software can remove the malware and repair the system.
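At its simplest, signature-based detection boils down to fingerprinting a file and checking the fingerprint against a database of known malware. The sketch below illustrates the idea in Python; the hash database is a made-up placeholder (it happens to contain the SHA-256 of empty content), not a real signature list.

```python
import hashlib

# Hypothetical database of known-malware SHA-256 fingerprints.
# This digest is a placeholder (the hash of empty content), not a real signature.
KNOWN_MALWARE_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def fingerprint(data: bytes) -> str:
    """Return the SHA-256 digest of a file's contents."""
    return hashlib.sha256(data).hexdigest()

def is_known_malware(data: bytes) -> bool:
    """Flag content whose fingerprint appears in the signature database."""
    return fingerprint(data) in KNOWN_MALWARE_HASHES
```

Real antivirus engines are far more elaborate (they scan memory, watch behavior, and match partial signatures), but exact fingerprinting like this is why even a one-byte change to a virus can defeat naive detection.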

Machine learning and artificial intelligence are used to a certain extent in developing antivirus software and Internet security suites. Based on previously detected malware, a security program can apply an algorithm that helps it recognize new variations of the same threat.

Vectors of Attack in Today’s World

When customers bring malware-infected computers to PC Performance Pros in Victorville, one of the actions taken by technicians is to investigate the attack vector; in other words, to determine how the virus entered the system in the first place. In many cases, the attack vectors are suspicious email attachments or rogue websites distributing malicious JavaScript code.

The latest attack vector is related to the growth of artificial intelligence and its fast rate of adoption. Smart personal assistants such as Siri, Cortana, and Google Now are now installed on smartphones and tablets around the world, and they are constantly listening for input and verbal commands.

Technology researchers believe that smart personal assistants could be easily duped by a mix of white noise and human voice snippets, which could direct the assistant to perform actions such as emailing contact lists to an unknown address, opening Bluetooth ports for listening, pulling up the URL of a rogue website to deliver malicious code, or tricking the device into joining a botnet.

Security analysts at Google are currently looking into ways of securing future versions of Google Now, a smart personal assistant that often handles personal information for smartphone and tablet users. Until then, they recommend not configuring Google Now to listen continuously.