Guide to using hopper disassembler

1/9/2024

Mobile applications are often viewed as safe storage by some developers. When you look at an application’s compiled code, it seems unreadable, and no hard-coded values can be found. However, an experienced and motivated pentester will find it really easy to extract such keys, and can even automate the process of extracting hard-coded values. That’s why you should never store or hard-code sensitive keys inside your app.

The problem of treating compiled mobile applications as safe storage is real. One article claims that 0.5% of mobile applications contain AWS API keys, which has resulted in the exposure of 100M+ users. The real number is surely bigger than 0.5%, as that article considers only AWS API keys. There are many more sensitive keys that should not be stored directly in code, and you can find a lot of articles describing such cases. There are also CWE issues describing the subject.

Hard-coded, sensitive data in an application binary can always leak and could be used to harm your business. API keys may provide access to third-party services like AWS storage, an SMS gateway, a payments API, or analytics. While analytics keys don’t carry much risk, leaking any of the other mentioned keys may lead to serious consequences.

Keys providing read access to public data from a free API. In such cases, there is no real risk: API requests are free and the stored data is public. An attacker can also generate such a key for themselves, so there is no reason for them to use yours.

Keys providing read access to confidential data. One of the best examples here is the AWS API key. It may carry read permissions that allow an attacker to download the whole database stored there. As a result, you will leak confidential data and may get a fine for breaking the GDPR or other data protection laws. You can see that many companies have received big penalties for violating the GDPR. There are a lot of articles about AWS API key leaks on the internet, e.g.

Let’s imagine a service in which an attacker could send an SMS using your hard-coded key and subscribe to a spam service that makes them money at your expense. It could result in losing large amounts of money in a short time from the account connected to the SMS gateway key.

Keys providing access to payment services. Let’s consider an application that has a key that allows sending money to or from a Bitcoin wallet. An attacker could find this key and send all the money out of your account, or even use it as a link in a money-laundering chain.

“Important: Do not include the server key anywhere in your client code. Also, make sure to use only server keys to authorize your app server. Android, iOS, and browser keys are rejected by FCM.” Using the Firebase cloud functions server key, an attacker could send notifications to every app user. Such notifications could contain malicious links and install malware applications, just as one example.

If you need to communicate with a sensitive, external API, ask the backend team to create an endpoint for that. This endpoint should be authenticated with a user token and implement proper security requirements. In this way, attackers should never get those API keys, and only you will be able to communicate with your backend.

Extracting the keys may be really hard if the app uses paid solutions to obfuscate the code and encrypt every string in the app. But in the end, it will still be possible to get them. On the other hand, it might be really easy: if the app doesn’t have any obfuscation, an attacker can automate a secret-key search. After creating such a script, an attacker would be able to automatically test many apps and find a lot of these secret values.
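To make the risk concrete, here is a minimal sketch of what such an automated secret search could look like. It is an illustration, not a hardened scanner: the regexes cover only two well-known key formats plus a generic catch-all, and the assumption that the app has been unpacked into a directory of readable files (e.g. apktool output) is mine, not the article's.

```python
import re
from pathlib import Path

# Illustrative patterns for well-known key formats; real scanners
# ship far larger pattern sets.
SECRET_PATTERNS = {
    "AWS access key ID": re.compile(r"AKIA[0-9A-Z]{16}"),
    "Google API key": re.compile(r"AIza[0-9A-Za-z\-_]{35}"),
    "Generic hex secret": re.compile(r"(?i)(?:api[_-]?key|secret)\W{0,3}([0-9a-f]{32,})"),
}

def scan_text(text):
    """Return (label, match) pairs for every pattern hit in one blob of text."""
    hits = []
    for label, pattern in SECRET_PATTERNS.items():
        for match in pattern.findall(text):
            hits.append((label, match))
    return hits

def scan_unpacked_app(root):
    """Scan every file under an unpacked app directory, collecting all hits."""
    findings = []
    for path in Path(root).rglob("*"):
        if path.is_file():
            # Decode leniently: binaries contain plenty of non-UTF-8 bytes,
            # but embedded ASCII string constants survive this decoding.
            text = path.read_bytes().decode("utf-8", errors="ignore")
            for label, match in scan_text(text):
                findings.append((str(path), label, match))
    return findings
```

Looping `scan_unpacked_app` over a folder of downloaded, unpacked apps is all the automation an attacker needs, which is exactly why the only safe place for such keys is behind your own authenticated backend endpoint.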