Evading JavaScript anti-debugging techniques (nullpt.rs)
236 points by hazebooth on Aug 1, 2023 | 51 comments



Interesting though it involves recompiling the web browser. I have encountered this issue on many websites and my response is to stream the website through a proxy server which can then save the content (both outgoing and incoming) to the local disk for analysis. Using the browser's debugging tool is a lost cause when you're dealing with obfuscated code. The approach that I use is to isolate the target JS, modify it by including calls to a websocket, save the code to disk and instruct the proxy server to load the code from disk instead of from the website. This way the website appears to work normally except with my modification. In some cases, it may be necessary to isolate an additional file or two due to dependencies.

The reason for the websocket is that the browser console is also rendered inoperable due to the debugger statements and console clear commands emanating from the website JS. A websocket is then the only way to transfer actionable information (such as a password or a secret link). It's not an easy or quick process but, by inserting websocket calls in interesting places, it is possible to figure out what the JS is doing. It also helps a lot to prettify the JS in order to study it. There are websites that can do that for you. Unfortunately, the prettification of the JS may break it so you're still stuck with doing the modifications in the original JS.
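
For illustration, the injected calls look roughly like this. The endpoint, port and the report() helper are placeholders I invented; the real thing depends on the site and on what its CSP will let you connect to:

    // Sketch of the kind of call spliced into the isolated JS.
    // Endpoint and helper names are made up.
    const spy = new WebSocket('ws://127.0.0.1:9000/log');
    const pending = [];
    spy.onopen = () => pending.splice(0).forEach(m => spy.send(m));
    function report(label, value) {
      // Out-of-band logging, since the page keeps clearing the console.
      const msg = JSON.stringify({ label, value: String(value), t: Date.now() });
      if (spy.readyState === WebSocket.OPEN) spy.send(msg);
      else pending.push(msg);
    }
    // e.g. report('token', someObfuscatedValue);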

I built my own proxy server for this task, but I imagine the same may be possible with a tool like HTTP Toolkit, though that means getting the Pro version.


In the VS Code JS debugger, there's an option to "exclude caller" on a call frame, which prevents stacks with the given caller from pausing at a location. As mentioned elsewhere, browser devtools have something similar with "Never pause here." Do you think there's more that tools can do to make your process easier?

I maintain the vscode debugger and found both the article and your comment interesting--there's a large overlap between "programs with anti-debugger techniques" and "programs that are hard to debug."


The overlap would be due to the JS obfuscation. This makes it both hard to debug and hard to run the debugger. What is needed is a way to unravel the obfuscation. This is mostly driven by a massive lookup table which contains text strings to be substituted for the coded vars in the JS. For example, a var called _0xff09b8 might be the code for 'toString'. Harder examples may involve coded vars that are used to call a function which generates the array subscript needed for the table lookup. It is literally mind-bending.
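
To make that concrete, the obfuscated source is usually organised something like the following. The names, offset and table contents here are invented for illustration:

    // General shape of the string-table pattern (values invented):
    const _0x2d91 = ['toString', 'charCodeAt', 'push'];  // the big lookup table
    const _0xff09b8 = 0x1f3;                             // a coded var
    function _0x52aa(i) { return _0x2d91[i - 0x1f3]; }   // subscript generator
    // so value[_0x52aa(_0xff09b8)](16) is really value.toString(16);
    // recover the table and you can substitute the strings back in as text.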

What I'm saying is that we need a way to get that table (array) and perform the substitutions in order to recreate the original code as text instead of numbers. This is likely way beyond the scope of a debugging tool. Or is it?


> Interesting though it involves recompiling the web browser.

Years ago I really wanted to disable the blink tag, so I just ran `perl -pi -e "s/blink/abcde/g"` on the binary and that worked well enough.

I'll bet you could do something similar with "debugger". On macOS, you'd break code signing, but you could re-sign it or strip the signing and let it run unsigned.


What makes it special on macOS?


macOS verifies signatures of all apps. Windows comes close if you configure enough GPOs; otherwise it just gives you a warning. (If you've seen the "unknown publisher" warning that literally everyone immediately clicks past, that's what I mean.)


Yeah. I used to do that sort of stuff a lot in my younger years, lol.


Been doing stuff similar to this for decades(?) using the Fiddler proxy. It does so much stuff browser extensions or browser inspectors don't. One of my most important tools for website debugging/hacking/workarounds.


Would the "Local Overrides" feature of chrome devtools simplify this workflow for you?


Local Overrides does what? Problem is that the devtools are not available due to the repeated abuse of the debugger and console clear commands. The other problem is storing content on the local disk for study. I don't think devtools do that.


Local Overrides stores the files you chose to override in a folder of your choosing. Subsequent requests for that resource while devtools is open will replace the contents with your local copy.

So the idea is to store it in Local Overrides, find the bad anti-debug code and remove it; then you get back full control in devtools.


That might be useful in some simpler cases. Lately, I've been hacking into some really hard stuff that pretty much required the use of two web browsers due to caching, garbage popups and other matters: one browser doing the debugging and another that I could cold-start without losing any work. The proxy server makes this division of labour possible.


> Problem is that the devtools are not available due to the repeated abuse of the debugger and console clear commands.

Sounds like we need a way to disable web site access to those commands.


Couldn't you load the site with devtools open and javascript disabled, add the override, and enable javascript?


I'm surprised to not see Chrome's handy "Never pause here" menu that appears when you right click any line of JS, including debug breakpoints. This is typically what I do when there's a debugger statement in a function run on an interval (a simple anti-debug trick commonly found on some video sites).

Example: https://i.imgur.com/BsphnEu.png


I knew I forgot to mention something :) I do love this option but I wasn't able to get it to work with this obfuscation technique either.


I don’t think that works with eval code because it doesn’t have a file:0:0 address


That would make sense.


That's interesting. I've always seen that but always ignored it, not knowing what I could use it for.


Unfortunately that won’t be an option with Web Integrity….


Glad you have said it.

And before a developer for these commerce websites jumps up and says “ah but supreme are trying to prevent bots from buying up all of their merch and scalping it”:

Supreme are restricting supply so they can maximise profits.

They are selling on the web rather than through traditional retail outlets using this method not to reach a wider audience for the audience’s sake but to have a larger number of people who are willing to pay an even higher price.

The web, the system that brings free information to the masses requiring no knowledge of the underlying technologies, is too important to compromise for these e-commerce platforms attempting to have their cake and eat it too.


Also note that as far as the sketchiness scale goes, this is basically a 5/10. Now consider the same tools in the hands of malware distributors. For example, I've seen these anti-debugger techniques on NFL piracy websites when I tried to investigate why my CPU was pinned to 100% while I was streaming the game.


> I've seen these anti-debugger techniques on NFL piracy websites when I tried to investigate why my CPU was pinned to 100% while I was streaming the game

Probably safe to assume they were mining cryptocurrency with your browser while you were watching the stream.


Probably. I got distracted by the game so I never found out. And I only visit those websites when I'm watching a stream anyway. Really it's quite a clever distribution vector, since it involves long sessions and any experts in the audience will be too distracted to look too deeply, or at least unmotivated to investigate and publish research against a site that helps them watch the games they want to watch. It's a symbiotic relationship in a way...

I did notice the ad serving infrastructure seemed quite sophisticated. There were so many domains and proxies and redirects. Luckily uBlock Origin blocks almost all of them. And usually, I can avoid any of the "bonus" features by opening the video player iframe in its own tab (but sometimes this isn't possible, or the video player tab has some scripts to make it annoying to run in isolation).

One thing I like to do during the commercial breaks is paste the URL of the site into GitHub Code Search. This always leads to interesting results, including blocklists, people's personal media scrapers, or sometimes even the (re-)publishing infrastructure of the sites themselves. It's also a great way to find alternative URLs or other streaming sites.


Scalpers only exist because the sellers are not raising the price to reduce demand. If the sellers just wanted a higher price, there is still room to go higher.


Ah yes, the supply vs. demand argument. My favorite.

[x] only exists because it is underpriced relative to market demand.

A great question to ask after this is "would I be okay subjecting my child/mother/father to this experience?"

For example, there are 2 tennis courts available on a first come, first serve basis at a park in SF. Because they are free to reserve, and there's more demand than supply, people will bot all the courts at all given times during the day and scalp players for the free reservations.

Or, a car is available for [x] msrp price, plus a 25% "market adjustment" fee. Is this ethical for every dealer in the country to do the same?


>Is this ethical for every dealer in the country to do the same?

Yes, if it means that a car that I want is in stock as opposed to having to wait for 6 months, I would consider it ethical.


It's in stock. You can't afford it, though, unless we're being charitable by assuming you're richer than the whales that such a practice, once pervasive, creates.


Modern laissez-faire capitalists don't care about ethics.

My memory of learning about "Wealth of Nations"-style capitalism is that there was an idea that people produce goods and services that other people find useful. So when they trade, both buyer and seller benefit. As opposed to scalpers, who interpose themselves between the two to the detriment of both the buyer and seller, with the benefit going only to the scalper.

Modern capitalists don't care if they make everything worse, only that someone is making money.


There are opportunity costs to decreasing production. There are points where you are achieving economies of scale for your suppliers that allow you to negotiate lower prices. Scalpers could be seen as a perverse source of marketing. I am not going to pretend though that I have a very strong understanding of luxury streetwear firms.

I appreciate you attempting to introduce some nuance to a discussion about goods. I would imagine that Supreme’s marketing team would be pretty savvy and know how to hit the mark.

My point is more about their determination to run obfuscated code on their users' computers and the intersection between that and Google's vision for the browser.


For people who don't want to compile the anti-debugging firefox themselves, I have set up a github repo to do it automatically: https://github.com/Sec-ant/anti-anti-debugging-debugger-fire...


>By renaming it to something like "banana," the debugger would no longer trigger on occurrences of the debugger keyword. To achieve this, we built customized version of Firefox.

Heavy-handed approach. I have had some moderate success intercepting setInterval/setTimeout and manually sifting to find that one call that starts the ball rolling. Things get old fast when the code you are looking at looks like

    0[_0x199d1e(0x815*-0x2+0x1735+0x13f*-0x5)](_0x199d1e(0x3b3*0xa+0x1c1+-0x260d),_0x199d1e(0x2149*0x1+0x9f7+0x1*-0x29f5)))[_0x
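
The interception itself is just wrapping the timer APIs before the site's own code runs (from the proxy, a user script or a devtools snippet). A minimal sketch; the string check is a crude heuristic and will miss anything the obfuscator hides:

    // Wrap the timer APIs early to see which callback keeps re-arming
    // the trap, and optionally refuse to schedule it.
    const realSetInterval = window.setInterval;
    window.setInterval = function (fn, delay, ...args) {
      if (String(fn).includes('debugger')) return 0;  // drop the suspect call
      return realSetInterval.call(window, fn, delay, ...args);
    };
    // Same idea for window.setTimeout.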


Just record a Replay (https://replay.io). Done!


> Once upon a time, whenever you tried to open your devtools on Supreme's website, you found yourself trapped in a pesky debugger loop.

Could somebody here explain what that means, since the article doesn't? What's a debugger loop? What is the actual JavaScript code that somehow prevents debugging, and how does it accomplish that?


Using a `debugger;` statement allows you to trigger a breakpoint with code.

This only gets activated when the devtools window is opened, so placing this statement in a frequently executed piece of code will continuously interrupt whatever you are doing in the devtools when you use them.

I assume in the past the tooling might not have had the necessary configuration options to suppress that, but nowadays you can just disable debugger statement breakpoints to avoid it.
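
In its simplest form the trap is just a couple of lines; the interval and the console.clear() are typical rather than universal:

    // A no-op while devtools is closed; with devtools open, an endless
    // stream of forced pauses plus a wiped console.
    setInterval(function () {
      debugger;
      console.clear();
    }, 100);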


The JavaScript statement is simply "debugger". Very easy to abuse. Of course, there are other techniques for breaking devtools. There are JS libraries designed for the purpose of detecting that the dev console is open. The response may be to run the debugger command, freeze the code, reload the web page or, worse, do some serious hanky-panky (it's not hard to crash the web browser; an endless loop can do that).
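
One detection trick these libraries have used in Chrome is logging an object whose property getter only fires when the console actually renders it. Treat this as a sketch of the idea; newer browser versions may not trigger it reliably:

    // Historically, Chrome's console read the element's id while
    // rendering it, so the getter fired only with devtools open.
    const bait = new Image();
    Object.defineProperty(bait, 'id', {
      get() {
        // A real script would reload the page, spin forever, etc.
        window.__devtoolsDetected = true;
        return '';
      }
    });
    setInterval(() => { console.log(bait); console.clear(); }, 1000);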


> Problem is that the devtools are not available due to the repeated abuse of the debugger and console clear commands.

What methods do they use to detect debugging tools and how do we defeat them?


Timing analysis is probably going to be the most reliable and annoying. I've heard they also detect when the window size suddenly changes, but that sounds ridiculously fragile and easy to defeat.
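
A minimal sketch of the timing idea, with an arbitrary threshold: a debugger statement returns almost instantly when nothing is attached, but blocks until the user resumes when the pause is actually honoured. (Disabling breakpoints in devtools defeats this particular check.)

    setInterval(function () {
      const before = performance.now();
      debugger;                             // instant if no debugger is attached
      if (performance.now() - before > 100) {
        location.reload();                  // typical "you've been caught" response
      }
    }, 2000);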

Difficult to imagine any anti-debugging techniques that will work against something that just records an execution trace.


They don't need to detect the devtools being open; debugger is a no-op when the devtools aren't open, so you can just run it in a loop forever.


The debugger command abuse can be defeated (as already mentioned in this thread). A devtool detector is used in order to invoke stronger blocking methods such as forcibly reloading the web page until the console is closed.


Search on GitHub. I don't know how to defeat them. I just don't use dev tools.


The SANS course for this still teaches you to use IE for debugging JS because it is the only browser that lets you break at arbitrary points in the code instead of only at line boundaries.


Chrome has breakpoints within the line too. They aren't particularly great though.


Note that if all you care about is capturing the web requests, you can use something like mitmproxy:

https://github.com/mitmproxy/mitmproxy


You can also use a MITM proxy tool to intercept the JS files and modify their response body to remove or replace the `debugger;` statements with something else. Might require inspecting the JS files first to see what needs to be replaced exactly, but should not take more than a few minutes.
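
The proxy wiring differs per tool, but the rewrite applied to the response body is essentially a couple of replacements like the ones below. These regexes are illustrative only and will miss anything assembled at runtime or hidden in strings, a limitation the replies below point out:

    // Naive transformation applied to intercepted JS bodies.
    function stripAntiDebug(jsBody) {
      return jsBody
        .replace(/\bdebugger\b\s*;?/g, ';')               // neutralise forced pauses
        .replace(/console\.clear\s*\(\s*\)/g, 'void 0');  // keep the console readable
    }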


That will not pass integrity checks (the script inspecting its own code).

It will also not work if the script is some initially obfuscated string that is passed to eval() or something more complex assembling the actual code on the fly.


> That will not pass integrity checks (the script inspecting its own code).

As us "old school crackers" would say, "NOP those out!"

As for obfuscation, you can unpack the scripts in order to do the needful, then use the proxy to "transparent redirect" requests for them to your own locally hosted unpacked and modded version.


>That will not pass integrity checks (the script inspecting its own code).

I've not seen anything like that. The integrity checks are generally limited to verifying the document location and the presence of certain elements in the DOM. Obfuscation techniques have become so sophisticated that integrity checks are not really necessary. Bot challenges (such as the one used by CloudFlare) may go so far as to test graphic elements like the canvas to ensure that the JS is actually running in a browser but I don't think this is a common thing for the average website that just wants to keep bots from scraping them.


This assumes that the script contains the word "debugger" in clear text; however, it may not. It may decrypt or descramble a string and then eval() it. Your approach wouldn't catch that.


This won't work with obfuscated JS.


Ah yes, the eternal arms race.



