How to JS for Pentest: Edition 2023

Kongsec
5 min read · May 18, 2023

Hi everyone,

I am Aditya Shende (Kong) from India: a bounty hunter, biker, and researcher.

This is my 13th article. If you find any spelling errors, let them be… Let's start.

JavaScript (.js) files serve as repositories for client-side code and can function as the fundamental framework of websites, particularly in contemporary contexts. With the evolution of technology, I have observed an increasing prevalence of substantial data stored within .js files on websites. When inspecting the source code of a website, it is common to encounter references to main.js and app.js, which are typical of bundled frameworks such as ReactJS. These websites heavily rely on JavaScript and employ Ajax requests. Such files encompass comprehensive information about the application, while the site may also load specific, distinct JavaScript files for each endpoint.

.js files are frequently underestimated due to their inclusion of intricate and unfamiliar code, which may appear nonsensical. However, by employing targeted keyword searches, valuable information can be extracted. Over time, as one familiarizes themselves with JavaScript and comprehends its workings, a clearer understanding of these files and their functionality will emerge. This is where an understanding of code and proficiency in JavaScript proves invaluable in the pursuit of bug bounties.

Locating .js files:

The process of finding .js files is relatively straightforward. One approach is to right-click on the web page and select “view source” (or visit view-source:https://www.website.com/). Then, you can search for occurrences of “.js” within the HTML code. This method is suitable for manual hackers, as it allows you to identify .js files that exclusively contain code relevant to the specific endpoint you are exploring. For example, you may come across a file named “config.js” that is specifically associated with that endpoint; it might unveil new API endpoints that were previously unknown to you.
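The extraction itself can be done with a one-line grep. Here is a minimal, self-contained sketch: the HTML below is a made-up stand-in for what you would fetch with curl -s https://www.website.com/, and the pattern pulls every script src that points at a .js file:

```shell
# Hypothetical page source; in practice pipe in: curl -s https://www.website.com/
html='<html><head>
<script src="/static/main.js"></script>
<script src="https://cdn.example.com/app.min.js?v=3"></script>
</head><body>
<script src="/js/config.js"></script>
</body></html>'

# -o prints only the matching part; cut strips the surrounding src="..."
echo "$html" | grep -oE 'src="[^"]*\.js[^"]*"' | cut -d'"' -f2
```

The same pipeline works against a saved page or a curl response; note that it will miss scripts injected dynamically at runtime, which is where Burp's proxy history helps.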

When employing Burp Suite’s spidering functionality, you will encounter numerous .js files, which should be subjected to further investigation. Additionally, as mentioned earlier, if the target website utilizes ReactJS, you are likely to encounter main.js and app.js files.

The items you should be searching for include:

  1. New Endpoints: Look for any references to new API endpoints within the JavaScript code. These endpoints might provide additional functionalities or access to specific features that are not available through the web application’s user interface.
  2. New Parameters: Pay attention to any new parameters being utilized in the JavaScript code. These parameters may allow you to manipulate or customize the behavior of the application.
  3. Hidden Features: Sometimes, the JavaScript code may contain sections or functions that are not exposed in the web application’s interface. These hidden features could potentially provide additional functionality or access to premium-only features. Determine if you can interact with these features even without a premium account.
  4. API Keys: Look for any occurrences of API keys within the JavaScript code. These keys may grant access to restricted APIs or sensitive data. Make sure to handle them securely and avoid exposing them.
  5. Developer Comments: Explore the JavaScript code for any developer comments, such as single-line (//) or multi-line (/* */) comments. These comments may reveal valuable information about the code, such as the date of publication or updates.
To hunt for such secrets in downloaded .js files, a recursive grep like the following can be used (duplicate keywords removed and literal dots escaped so ".env" and ".git" match exactly):

grep -r -E "aws_access_key|aws_secret_key|api key|passwd|pwd|heroku|slack|firebase|swagger|aws key|password|ftp password|jdbc|db|sql|secret jet|config|admin|json|gcp|htaccess|\.env|ssh key|\.git|access key|secret token|oauth_token|oauth_token_secret" /path/to/directory/*.js

Make sure to replace /path/to/directory with the actual path to the directory where your .js files are located. The command will recursively search for the specified keywords in all .js files within that directory.

Please note that it's important to exercise caution and follow ethical guidelines when performing searches like this, ensuring you have proper authorization to access and analyze the files.
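To see what a hit looks like, here is a minimal, self-contained sketch. The jsdemo directory and its config.js contents are entirely made up; in a real engagement the directory would hold JavaScript files you downloaded from the target:

```shell
# Hypothetical demo: seed a sample .js file, then run a keyword grep
# against the directory exactly as you would against real downloaded files.
mkdir -p jsdemo
cat > jsdemo/config.js <<'EOF'
// TODO: remove before release
var apiKey = "AKIA-EXAMPLE-NOT-REAL";
var firebaseUrl = "https://demo-project.firebaseio.com";
EOF

# -i makes the search case-insensitive, so "apiKey" is caught by "apikey"
grep -r -i -E "apikey|aws_access_key|firebase|passwd|secret" jsdemo/
```

Adding -i is often worthwhile, since developers mix camelCase (apiKey), snake_case (api_key), and upper case (API_KEY) freely.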

The following command sequence is used to discover subdomains, identify JavaScript files (with HTTP response status 200), and save the results in separate files. Here’s an explanation of each part:

  1. subfinder -d domain.com: This command utilizes the tool called Subfinder to discover subdomains of the specified domain (domain.com). Subfinder is a subdomain discovery tool that uses various sources to find subdomains associated with a domain.
  2. | httpx -mc 200: The pipe (|) symbol is used to pass the output of the previous command as input to the next command. In this case, the output of the subfinder command is passed to httpx. The httpx command is used to send HTTP requests and filter the responses with a status code of 200 (successful response).
  3. | tee subdomains.txt: Again, the pipe (|) symbol is used to pass the output of the previous command, which contains the discovered subdomains with a 200 status code, to the tee command. tee is a command-line utility that allows you to both display the output on the screen and save it to a file. In this case, it saves the subdomains to a file named subdomains.txt.
  4. cat subdomains.txt | waybackurls: The cat command is used to read the contents of the subdomains.txt file. The output is then passed as input to the waybackurls command. waybackurls is a tool that retrieves historical URLs from the Wayback Machine, which is an archive of web pages. This command helps in finding URLs that were previously available but may not be currently accessible.
  5. | httpx -mc 200: Similar to the previous usages, the pipe (|) symbol passes the output from waybackurls to httpx, which filters URLs with a status code of 200.
  6. | grep "\.js" | tee js.txt: The grep "\.js" command filters the URLs down to those containing a literal ".js" (the dot must be escaped, because an unescaped dot matches any character). The pipe passes these URLs to tee, which displays them on the screen while simultaneously saving them to a file named js.txt.

In summary, this command sequence combines various tools (subfinder, httpx, waybackurls, and grep) to find subdomains, retrieve historical URLs, filter JavaScript files, and save the results in separate files (subdomains.txt and js.txt).
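A quick illustration of why the dot in the grep step should be escaped; the URLs below are made up. Unescaped, the dot is a regex wildcard, so a path segment like "projs" slips through:

```shell
# Three hypothetical URLs, only two of which reference actual .js files
urls='https://sub.domain.com/static/app.js
https://sub.domain.com/projs/index.html
https://sub.domain.com/assets/main.js?v=2'

# Unescaped: "." matches any character, so "projs" (o+j+s) also matches
echo "$urls" | grep '.js'

# Escaped: only a literal ".js" survives the filter
echo "$urls" | grep '\.js'
```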

The complete command is as follows:

subfinder -d domain.com | httpx -mc 200 | tee subdomains.txt && cat subdomains.txt | waybackurls | httpx -mc 200 | grep "\.js" | tee js.txt

Please ensure that you replace domain.com with the actual domain you want to search.

Now you can grep the collected URLs: cat js.txt | grep -E "aws_access_key|aws_secret_key|api key|passwd|pwd|heroku|slack|firebase|swagger|aws key|password|ftp password|jdbc|db|sql|secret jet|config|admin|json|gcp|htaccess|\.env|ssh key|\.git|access key|secret token|oauth_token|oauth_token_secret" (note: no -r flag, since the input comes from stdin rather than a directory). Keep in mind this only searches the URL strings themselves; to search the JavaScript content for these keywords, download each file first and grep the local copies.

Once you have the JS URLs, you can run the Nuclei exposures templates against them to surface more sensitive information.

To run a Nuclei command on the js.txt file with the exposures tag, you can use the following command:

nuclei -l js.txt -t ~/nuclei-templates/exposures/ -o js_exposures_results.txt

Here’s an explanation of each part of the command:

  • nuclei: This is the command to run Nuclei, a fast and customizable vulnerability scanner.
  • -l js.txt: The -l flag specifies the file (js.txt) containing the list of URLs to scan with Nuclei.
  • -t ~/nuclei-templates/exposures/: The -t flag specifies the path to the Nuclei templates directory for the exposures tag. Adjust the path ~/nuclei-templates/exposures/ to match the actual path where your Nuclei templates are stored.
  • -o js_exposures_results.txt: The -o flag is used to specify the output file (js_exposures_results.txt) where the scan results will be saved. You can replace js_exposures_results.txt with the desired output file name.

Make sure you have Nuclei and the relevant templates (in this case, templates related to exposures) installed and configured properly before running the command. Adjust the paths and filenames according to your specific setup.
