no server side. When possible we will refer back to OWASP categories to be clearer. • Concrete approach, not only theoretical, with real-world examples. • Explain the tools used, and why a jailbreak is important for these assessments, even though it (more or less) nullifies your device's security. • Try to share the passion for Reverse Engineering.
attacks that we speak about because they affect every user of your mobile application, regardless of whether an attacker has physical access to the device or whether it is jailbroken. • If the leaked information is valuable, the risk of this vulnerability is usually high, because it is easy to exploit. It can also lead to the exploitation of other vulnerabilities (like malicious content injection).
traffic passes from your iOS device to the server, maybe somebody is listening? • Anyone on your network who can somehow access your traffic. (Sysadmins? Other users passively or actively sniffing other people's traffic?)
in transit • It's not that hard :) Use SSL/TLS (i.e. use HTTPS instead of HTTP, and don't pass sensitive data in GET parameters, for example). • Don't disable certificate validation! At least be sure not to do it in production! Otherwise your SSL/TLS implementation will be pretty much ineffective. A sketch of the anti-pattern to avoid follows.
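To make the warning concrete, here is a minimal sketch (in current Swift syntax, with a made-up class name) of what "disabling certificate validation" typically looks like in a URLSession delegate. If you see something like this outside of a debug build, the connection will trust any certificate, including one forged by a MiTM proxy:

import Foundation

// Anti-pattern: a session delegate that accepts whatever certificate the server presents.
class InsecureSessionDelegate: NSObject, URLSessionDelegate {
    func urlSession(_ session: URLSession,
                    didReceive challenge: URLAuthenticationChallenge,
                    completionHandler: @escaping (URLSession.AuthChallengeDisposition, URLCredential?) -> Void) {
        if let trust = challenge.protectionSpace.serverTrust {
            // DON'T ship this: it blindly trusts the presented certificate chain.
            completionHandler(.useCredential, URLCredential(trust: trust))
        } else {
            completionHandler(.performDefaultHandling, nil)
        }
    }
}

Removing the custom challenge handling (or answering with .performDefaultHandling) restores the system's normal certificate validation.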
accepted if it's signed by a CA that ships in iOS. • If you disable the validation, any certificate will be accepted. • If you disable it, an attacker can MiTM the communication, intercept it and see everything, re-signing the traffic going to your device, and you will not be able to verify whether you are speaking with the legitimate server or with someone in the middle!
• Scenario: my app transmits very sensitive data (banking apps, for example). • I don't want an advanced adversary who can somehow get certificates signed by a valid CA to be able to read my traffic. • Solution: Certificate Pinning.
anymore on what is signed by a CA; instead you verify the server certificate against a "known good" set. • You are in control and you can choose to trust exactly one certificate: yours (a sketch follows below). • https://www.owasp.org/index.php/Certificate_and_Public_Key_Pinning#iOS
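As an illustration only (the OWASP page above has the reference implementation), here is a rough sketch of pinning a single certificate, assuming the known-good certificate is bundled in the app as "server.der" (the file and class names are ours):

import Foundation
import Security

class PinnedSessionDelegate: NSObject, URLSessionDelegate {
    func urlSession(_ session: URLSession,
                    didReceive challenge: URLAuthenticationChallenge,
                    completionHandler: @escaping (URLSession.AuthChallengeDisposition, URLCredential?) -> Void) {
        guard let trust = challenge.protectionSpace.serverTrust,
              let serverCert = SecTrustGetCertificateAtIndex(trust, 0),
              let pinnedURL = Bundle.main.url(forResource: "server", withExtension: "der"),
              let pinnedData = try? Data(contentsOf: pinnedURL) else {
            completionHandler(.cancelAuthenticationChallenge, nil)
            return
        }
        // Compare the presented leaf certificate byte-for-byte with the bundled copy.
        let serverData = SecCertificateCopyData(serverCert) as Data
        if serverData == pinnedData {
            completionHandler(.useCredential, URLCredential(trust: trust))
        } else {
            // Someone is presenting a different certificate: refuse to talk.
            completionHandler(.cancelAuthenticationChallenge, nil)
        }
    }
}

Pinning the public key instead of the whole certificate (as OWASP also describes) survives certificate renewal better; either way, a wrong pin must fail closed.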
sensitive data at rest, or don't save it at all. • On iOS the private application folder is protected by the sandbox, but the device can be lost or jailbroken and that data retrieved. • Other forms of leakage of this information can be backups and the like. • Regular users often reuse the same passwords for everything, so a leak from a single application is enough to compromise a lot of them.
A detailed explanation would require a big chunk of this presentation; if interested you can learn more in this excellent Apple document: http://images.apple.com/privacy/docs/iOS_Security_Guide_Sept_2014.pdf • TL;DR: You can set a "protection level" on files and Keychain items, which determines when something is available in decrypted form (for example: always, after the first unlock of the device, or only while the device is unlocked, so it is encrypted when locked). A short example follows.
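For instance (file and account names below are placeholders), choosing a protection level is a one-line decision both for files and for Keychain items:

import Foundation
import Security

// File: readable only while the device is unlocked (NSFileProtectionComplete).
let secret = Data("very sensitive".utf8)
let docs = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
try? secret.write(to: docs.appendingPathComponent("secret.bin"), options: .completeFileProtection)

// Keychain item: decryptable only while the device is unlocked.
let query: [String: Any] = [
    kSecClass as String: kSecClassGenericPassword,
    kSecAttrAccount as String: "session-token",
    kSecValueData as String: secret,
    kSecAttrAccessible as String: kSecAttrAccessibleWhenUnlocked
]
SecItemAdd(query as CFDictionary, nil)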
another layer of encryption to your SQLite databases or Core Data, deriving the encryption key from a user passcode. • SQLCipher is an encrypted SQLite, easy to use (see the sketch below). • Keep an eye on the iMAS project, which offers a lot of libraries and components to help you with the development of security features. http://project-imas.github.io/ https://www.zetetic.net/sqlcipher/ios-tutorial
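A tiny SQLCipher sketch, assuming the SQLCipher library is linked in place of the stock sqlite3 (the module name and the key string are assumptions; the key should come from a derivation function like the one in the next snippet):

import SQLCipher // module name is an assumption; it depends on how SQLCipher is integrated

// sqlite3_key() is provided by SQLCipher; it must be called right after opening,
// before any other statement touches the database.
var db: OpaquePointer?
if sqlite3_open("secure.db", &db) == SQLITE_OK {
    let key = Array("key-derived-from-the-user-passcode".utf8)
    sqlite3_key(db, key, Int32(key.count))
    // ... regular prepared statements / queries from here on ...
}
sqlite3_close(db)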
use standardised and tested crypto libraries developed by experts. • Don't use static keys: generate encryption keys from user input (passcode or password) with a good key derivation function, and don't persist them (see the sketch below). • Don't embed static keys in the code (more on this in the demo later).
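For instance, a hedged sketch of deriving an AES-256 key from the user's passcode with PBKDF2 via CommonCrypto (salt storage and iteration-count tuning are left out; the function name is ours):

import CommonCrypto
import Foundation

// Derive a 32-byte key from a passcode; the salt must be random, per-user, and stored.
func deriveKey(from passcode: String, salt: Data, rounds: UInt32 = 10_000) -> Data? {
    var derived = Data(count: kCCKeySizeAES256)
    let status = derived.withUnsafeMutableBytes { derivedBuf in
        salt.withUnsafeBytes { saltBuf in
            CCKeyDerivationPBKDF(CCPBKDFAlgorithm(kCCPBKDF2),
                                 passcode, passcode.utf8.count,
                                 saltBuf.bindMemory(to: UInt8.self).baseAddress, salt.count,
                                 CCPseudoRandomAlgorithm(kCCPRFHmacAlgSHA256),
                                 rounds,
                                 derivedBuf.bindMemory(to: UInt8.self).baseAddress, kCCKeySizeAES256)
        }
    }
    return status == Int32(kCCSuccess) ? derived : nil
}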
to take snapshots when the app goes into the background. • They can leak sensitive data like login screens, or your bank summary. • You can grab them from the application's private folder, under Library/Caches/Snapshots. • Use callbacks like applicationDidEnterBackground: to sanitise the screen if sensitive data is shown (see the sketch below). (OMG, it's my bank!)
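A small sketch of the idea (the cover view and tag are made up; some apps blur the UI instead of hiding it):

import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {
    var window: UIWindow?
    private let coverTag = 4242 // arbitrary tag so we can find the cover view again

    func applicationDidEnterBackground(_ application: UIApplication) {
        // Cover the UI before the system takes the snapshot that ends up in
        // Library/Caches/Snapshots.
        let cover = UIView(frame: window?.bounds ?? .zero)
        cover.backgroundColor = .white
        cover.tag = coverTag
        window?.addSubview(cover)
    }

    func applicationWillEnterForeground(_ application: UIApplication) {
        window?.viewWithTag(coverTag)?.removeFromSuperview()
    }
}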
to IPC and interoperability between apps, especially in iOS 8. This exposes another attack surface: Inter-Process Communication. • Custom URL Schemes: be careful with the untrusted data coming from outside (see the sketch below). • iBeacons: the identifiers of iBeacons can be spoofed; be careful if iBeacons trigger actions in your app. • In iOS 8 you can create extensions. A whole new attack surface, but luckily there may be some useful expertise from Android, which has adopted this model from the beginning.
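For the URL scheme case, a hedged sketch (using the current delegate method; scheme, host and parameter names are invented) of treating the incoming URL as untrusted input:

import UIKit

class SchemeAwareAppDelegate: UIResponder, UIApplicationDelegate {
    func application(_ app: UIApplication, open url: URL,
                     options: [UIApplication.OpenURLOptionsKey: Any] = [:]) -> Bool {
        // Expected shape: myapp://open-note?id=42 (purely illustrative).
        guard url.scheme == "myapp", url.host == "open-note",
              let items = URLComponents(url: url, resolvingAgainstBaseURL: false)?.queryItems,
              let id = items.first(where: { $0.name == "id" })?.value,
              Int(id) != nil else {
            return false // anything unexpected is rejected, never interpolated into queries or HTML
        }
        // ... navigate to the note with the validated numeric id ...
        return true
    }
}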
new projects are good. • Be sure to check that you are generating PIE code and using ARC. • Position Independent Executable code makes it harder for an attacker exploiting vulnerabilities to reliably find pieces of code or information at a known address (the application is loaded at a different memory location at each launch). • Enable stack smashing protection by adding -fstack-protector-all to "Other C Flags". • ARC not only greatly improves the development experience, it also simplifies memory management, mitigating potential vulnerabilities related to these problems (use-after-free, for example). • Don't embed secrets in the code.
party applications, even from the App Store, by decrypting them at runtime with publicly available tools like "Clutch", or manually with a debugger. • Run the binary through class-dump to retrieve the Objective-C class headers. • It can also be done on Swift apps; the names are less explicit, but they can be demangled. • Usually a good starting point.
language, and so are the tools and techniques to reverse it. • It does not have metadata as pretty as Objective-C's, but it still has a lot of it. Class names must be "demangled". • Do you really think this can stop an experienced reverse engineer who eats stripped, statically compiled binaries for breakfast? :)
app, it's in the store and it has parts developed in Swift. • See this very interesting talk by saurik: https://www.youtube.com/watch?v=Ii-02vhsdVk • Mangled Swift class name example: _TtC4WWDC15WWDCAppDelegate • Keep an eye on the community in the next months for more resources on Swift reversing.
Swift leak more information about classes and user methods than you may want. • If you have important intellectual property to protect, you may want to code it in C instead of Objective-C or Swift: the compiler will optimize it and strip information that would instead remain in Objective-C and Swift code. And maybe apply an obfuscator to the critical parts. Keep an eye on: https://github.com/obfuscator-llvm/obfuscator/wiki
logic to check if the PIN is correct: • Take the PIN and calculate its MD5, then check it against a stored MD5 of the right PIN. • If correct, dismiss the ViewController, granting access to the app. A sketch of this (weak) check is shown below.
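A hedged reconstruction of that logic (class and property names are ours; the hash below is simply md5("1234") as an illustration):

import CommonCrypto
import UIKit

class PinViewController: UIViewController {
    // Unsalted MD5 of the "right" PIN; 81dc9bdb52d04dc20036dbd8313ed055 is md5("1234").
    private let storedPinHash = "81dc9bdb52d04dc20036dbd8313ed055"

    private func md5Hex(_ input: String) -> String {
        var digest = [UInt8](repeating: 0, count: Int(CC_MD5_DIGEST_LENGTH))
        let bytes = Array(input.utf8)
        CC_MD5(bytes, CC_LONG(bytes.count), &digest)
        return digest.map { String(format: "%02x", $0) }.joined()
    }

    func checkPin(_ enteredPin: String) {
        // Both weaknesses live here: the unsalted MD5 of a 4-digit PIN, and a
        // purely client-side decision that a debugger or tweak can flip.
        if md5Hex(enteredPin) == storedPinHash {
            dismiss(animated: true, completion: nil)
        }
    }
}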
Two obvious approaches that we can take: • 1. We can make the view controller dismiss even if we don't have a valid PIN (trivial). • 2. We can retrieve the MD5 of the valid PIN, then crack it trivially: there is no salt and the PIN is only 4 digits (we will do this, just to also cover this easy-to-crack PIN implementation).
recently. • Our app can be notified if the user successfully authenticates with biometrics, thanks to Touch ID. • It would be cool to use it, right?
problem is that the entire authentication is performed by code running on the device, so after a jailbreak it's possible for an attacker to influence it and bypass our authentication, making the code behave as if the authentication succeeded. • So this feature must be used carefully, and it's NOT a silver bullet for authentication in our apps. • Apple itself requires the passcode at least once (so, something that the user knows)! A sketch of the API is shown below.
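For reference, a minimal sketch of the LocalAuthentication flow (the wrapper function and reason string are ours); note that the success boolean is produced entirely on the device, which is exactly the limitation described above:

import LocalAuthentication

func authenticateWithTouchID(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) else {
        completion(false) // no sensor, nothing enrolled, or no passcode set
        return
    }
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your account") { success, _ in
        // A jailbroken device can hook this callback, so never treat it as proof
        // of identity for anything the server relies on.
        completion(success)
    }
}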