Unfortunately for attendees of the Black Hat and DEF CON 2017 conferences on IT and cyber security, this could become all too true. Unlike others on wild sprees in the gambling capital of America, who would dearly like to leave all the evidence behind them, these conference attendees could be leaving behind valuable personal or company data, if the super-hackers have their way.
From a hacker’s point of view, healthcare has it all. There’s confidential personal data in abundance, including health information and social security details, mobile apps with sloppy security, and healthcare institutions and end users who can’t or won’t accept their vulnerability as targets. The road to IT security hell is being paved with good intentions and a strong dose of denial (“it’ll never happen to our clinic”). Consequently, cybercriminals have been flocking to the healthcare sector to share in the pickings.
The age of the car app is upon us. You can use the manufacturer’s app for your shiny new BMW, for example, to find your car in crowded parking lots, turn on the lights, honk the horn, and defrost it when temperatures are subzero, all remotely. That’s the good news. The bad news is that apps mean security holes that can be exploited by bad actors, and they won’t just be flashing your headlights or sounding your car horn.
Many cities are already embarking on creative uses of systems, data, and applications, engaging with vendors, operations departments, and mobile device users to make life easier, be responsive, and cut expenses. However, security remains a critical issue. Among other things, smart cities multiply the opportunities for data theft and system sabotage by bad actors lying in wait for a chance to strike.
Gordon Gekko, lead baddie in the movies “Wall Street” and “Wall Street: Money Never Sleeps”, would surely feel in tune with today’s cybercriminals. Virtual villainy doesn’t sleep either. Whether it’s a zero-day threat or some other app or software vulnerability, there’s always a hacker out there somewhere who is ready to exploit a security defect, day or night.
Nobody (except a bad actor) wants their mobile app to wreak havoc on unsuspecting users, or expose them to glaring flaws in security. By looking inside the app to analyze the software, you can see how it behaves, but what about the domains and networks it links to, or apps that share some of the same code? How safe are they?
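One way to start answering that question is to look at which external domains an app’s code actually references. As a minimal illustrative sketch (not any vendor’s actual analysis tool), the snippet below scans raw app bytes for embedded URLs and extracts the domains, which could then be checked against reputation or threat-intelligence lists; the sample data is invented for demonstration.

```python
import re

# Match http(s) URLs inside raw bytes and capture the host portion.
URL_RE = re.compile(rb"https?://([A-Za-z0-9.-]+\.[A-Za-z]{2,})")

def extract_domains(blob: bytes) -> set[str]:
    """Return the set of domains referenced in an app binary's bytes."""
    return {m.group(1).decode("ascii").lower() for m in URL_RE.finditer(blob)}

# Toy stand-in for the bytes of an unpacked app binary.
sample = b"\x00cfg=https://api.Example-Ads.net/v1\x00https://cdn.example.com/a.js\x00"
print(sorted(extract_domains(sample)))
# -> ['api.example-ads.net', 'cdn.example.com']
```

Real-world static analysis goes much further (decompiling code, resolving obfuscated strings, mapping shared libraries), but even a simple domain inventory like this surfaces the network endpoints an app quietly depends on.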
Don’t get us wrong – manual testing of software is still an important and valuable part of quality assurance, leveraging human intuition and inventiveness for proper coverage. But there’s a penalty to pay as well. Manual testing takes time and effort. When good automation can handle a large part of what would otherwise have been manual testing, in minutes instead of days, and at a fraction of the cost, insisting on manual testing for everything could be a significant error.
Many apps ask for information and permissions at the time of installation to track user activity, and users often consent to these requests. However, when such an app is deleted by the user, such tracking should stop as well. Instead, using a technique known as fingerprinting, app creators can illicitly keep tracking a mobile device even after the app has been deleted, and recognize the same device again if the app is later reinstalled.
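The core of fingerprinting is simple: combine a handful of stable device attributes into one identifier that survives app deletion. The sketch below shows the general idea only; the attribute names are invented for illustration, and real fingerprinting scripts harvest many more signals (installed fonts, sensor calibration quirks, and so on).

```python
import hashlib

def device_fingerprint(attributes: dict) -> str:
    """Hash a set of stable device attributes into a single identifier.

    Sorting the keys makes the hash deterministic: the same device
    attributes always produce the same fingerprint, install after install.
    """
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Hypothetical attributes a tracking SDK might read from the OS.
device = {
    "model": "Pixel 2",
    "os_version": "8.1.0",
    "screen": "1080x1920",
    "timezone": "America/New_York",
}

fp = device_fingerprint(device)
# Because none of these values change when the app is uninstalled,
# a server seeing this fingerprint again can re-link the device.
```

None of this requires a persistent cookie or an advertising ID, which is exactly why deleting the app does not break the link.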