Update 2021-08-06: An Akamai notice [backup] posted some months after our publication suggests this ARL issue will stop working globally on 2021-08-10. They also wrote an explicit vuln explainer [gated link]. The companion repo and links have been updated to reflect this sunset.

Apify bug bounty money bagz

In my last post I tried a specific hack against a relatively small set of known bug bounty targets, a few thousand of them.

From elsewhere in the multiverse came sounds of Big Data kids laughing —

“Only a couple thousand? You suck!”

😓

It also occurred to me that some of the “errors” in that simplistic process could have been a WAF tarpitting my default Python requests User-Agent.

Yeah silly me, I hadn’t even changed the User-Agent…

Akamai PowerShell running

We’re going to redeem ourselves now by leveraging Apify — a headless browser manipulator’s dream — against many more targets.

^^ That Project Discovery/Chaos scrape has more than 5 million valid bug bounty hostnames.

No more scraping by hand

I used to write scrapers by hand. You can see on GitHub that my OddsPortal.com and Nitrogen Sports scraping jobs were popular.

But nowadays this gives me no pleasure. Not when there are low-code and no-code solutions like Apify.

There are probably similar tools and competitors out there, though I’m not familiar with any.

Old guy hammering on Firefox logo

Our vuln checks

Like last time, we want to execute the same initial check across our targets —

# Request to http://subdomain.example.org
  # Response code 400
    # Response protocol HTTP/1.0
      # Response header "Server: AkamaiGHost"
        # Response header "Content-Type: text/html"
          # Response body includes "Invalid URL"

Then if that goes well, try a payload as seen in the original GitHub repo. Again I don’t want to clown on the XSS-vulnerable site too much.

We’re looking at the response to this follow-up request for —

# Request to http://subdomain.example.org/<PAYLOAD>
  # Response code 200
    # Response protocol HTTP/1.0
      # Response header "Server: Apache-Coyote/1.1"
        # Response body includes "reallylongstringtomakethepayloadforxssmoveoutofview"
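
My first take expressed those two checks in plain Python requests, roughly like the sketch below. It is not the exact script, and the payload itself lives in the original repo, so it stays a placeholder here.

import requests
import urllib3

urllib3.disable_warnings()  # we are ignoring SSL errors on purpose

PAYLOAD = "<PAYLOAD>"  # from the original GitHub repo, intentionally not reproduced
MARKER = "reallylongstringtomakethepayloadforxssmoveoutofview"
HEADERS = {"User-Agent": "Mozilla/5.0"}  # lesson learned: don't advertise python-requests

def looks_vulnerable(hostname, timeout=10):
    """Run the two-step check against one hostname."""
    base = "http://" + hostname
    try:
        r1 = requests.get(base, headers=HEADERS, timeout=timeout, verify=False)
    except requests.RequestException:
        return False
    # Step 1 -- the Akamai "Invalid URL" error page
    if not (r1.status_code == 400
            and r1.raw.version == 10  # HTTP/1.0
            and r1.headers.get("Server") == "AkamaiGHost"
            and "text/html" in r1.headers.get("Content-Type", "")
            and "Invalid URL" in r1.text):
        return False
    # Step 2 -- the payload request
    try:
        r2 = requests.get(base + "/" + PAYLOAD, headers=HEADERS,
                          timeout=timeout, verify=False)
    except requests.RequestException:
        return False
    return (r2.status_code == 200
            and r2.headers.get("Server") == "Apache-Coyote/1.1"
            and MARKER in r2.text)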

My first take on this was Python, as sketched above. For Apify we now need to translate that logic into some server-side JavaScript.

Setting up Apify

Getting a basic account doesn’t cost anything, and can be as straightforward as using GitHub to authenticate.

So you’ll first do that -> my.apify.com

my.apify.com

Then we’ll orient ourselves with their Puppeteer Scraper.

Click the Try for Free button and that will put us into their task editor, below.

Apify task editor

Our 1337 haxxor senses will guide us through many of the prompts here.

  • Proxy configuration?
    • Apify Proxy (automation)
  • Proxy rotation?
    • Use recommended settings
  • Browser masking
    • Use Chrome?
      • Yes
    • Use Stealth?
      • Oh yes
  • Security
    • Ignore SSL errors?
      • Always do
    • Ignore CORS and CSP?
      • Yeah, who needs those

And you can quote me on those recommendations.

Now, there are some parts of this initial task editing view that require more careful consideration.

  • Start URLs
    • Upload text file
      • You can upload a .txt that combines those aforementioned lists converted to URL format
      • Or — for the less brave people — start with a tiny subset

Here’s a simple Python script to turn those 2 big lists into a bigger list with an http:// prefix.
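
Something along these lines works. It is a minimal sketch that assumes the source lists are plain newline-delimited hostname files; the input filenames are placeholders.

# Merge the hostname lists into one deduplicated file of http:// URLs for Apify.
# The input filenames are placeholders -- point these at your actual lists.
INPUT_FILES = ["chaos_hostnames.txt", "other_bounty_hostnames.txt"]
OUTPUT_FILE = "exhaustive_bug_bounty_targets.txt"

seen = set()
with open(OUTPUT_FILE, "w") as out:
    for path in INPUT_FILES:
        with open(path) as f:
            for line in f:
                hostname = line.strip()
                if hostname and hostname not in seen:
                    seen.add(hostname)
                    out.write("http://" + hostname + "\n")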

Either that’ll give us a definitive response code later on, or we’ll have Puppeteer follow the redirect (assume https://) to something definitive.

exhaustive_bug_bounty_targets.txt

Seriously, Apify will take this 180 MB chonker. Configure that list as the only URL entry in the task editor, then get rid of their “pseudo-URLs” default entry too.

Apify bug bounty start URL list

Under that you’ve got the “Link selector” field, with a default value of a. It’s very important here that we clear this field out.

Effectively, that’ll keep the crawler from trying to crawl. We just need top-level testing for this Akamai thing — zero depth.

Next we need to set the “Page function” with the logic needed to execute on each target. This GitHub gist shows my approach.

Importantly, the page function has to return an object containing the data we want Apify to record in the result set. That’s what will later tell us where to submit bounty reports. 🎳

Performance settings I recommend are shown below, with green indicating which fields were adjusted away from defaults.

Apify Akamai utility's performance settings

The very final thing in this section would be to hit Run, because then…

Robots will do the work

Yep, robots will do the work.

Apify Akamai hack running

As it runs you can preview the dataset, which shows the last 100 entries.

And as you can see in my screenshot above, there are plenty of output format options once it’s done.

Kick back and listen to the Mortal Kombat 90s techno song or something.

Aftermath

The not-automatic part is searching around your results and then working backwards to contact eligible programs.

The preferred method for this will vary from person to person… Microsoft Excel is fine for me, after exporting the Apify dataset in Excel format.
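
If spreadsheets aren’t your thing, a few lines of pandas over a CSV export can do the same triage. The column names below are assumptions and need to match whatever your page function actually returned.

import pandas as pd

# Load the Apify dataset export -- CSV here, though read_excel handles .xlsx too.
df = pd.read_csv("apify_dataset_export.csv")

# "vulnerable" and "url" are assumed column names from the page function's
# returned object; adjust them to your own fields.
hits = df[df["vulnerable"].astype(str).str.lower() == "true"]
print(hits["url"].tolist())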

Closing thoughts

I think stuff like Apify mostly gets used today for growth marketing. However, there are advantages to having it in your security tool belt.

Here we just tried a simple hack — sometimes that’s all you need.

This was like my 2nd day trying to do bug bounty so I’m still learning too.

Anyway, creativity is what separates us from robots, right?

Happy hacking 🥂

BugCrowd dweeb money pit


Randy Gingeleski - GitHub - gingeleski.com - LinkedIn