Cover image credit: https://www.misecurity.net/content/images/size/w960/2022/10/image.jpeg

TCM — Practical Ethical Hacking Course — Information Gathering (Reconnaissance)

Passive Reconnaissance

Shivansh Seth
4 min read · Aug 6, 2023


  1. Physical / Social → Information gathering without accessing the target directly. Information that can be extracted includes pictures, employee details, network layout, etc.
  2. Web / Hosts

a. Target Validation → WHOIS, nslookup, dnsrecon

b. Finding Subdomains → Google Fu, dig, Nmap, Sublist3r, Bluto, crt.sh etc.

c. Fingerprinting → Nmap, Wappalyzer, WhatWeb, BuiltWith, NetCat

d. Data Breaches → HaveIBeenPwned, Breach-Parse, WeLeakInfo

Identifying Our Target

We first need to look up the Rules of Engagement for any client we’re hacking into.

Keep an eye on what is in scope and out of scope while hunting.

Discovering Email Addresses

  1. www.hunter.io → Used to gather email addresses for a particular business or organization
  2. phonebook.cz → A bit of an old UI, but it works efficiently. Here we can get email addresses, domains and URLs
  3. https://www.voilanorbert.com/ → Here we get 50 leads for free; after that it becomes paid
  4. Clearbit Connect → A Chrome extension which helps us find email accounts easily
  5. https://tools.emailhippo.com/ → A website which helps us check whether an email address we found is potentially credible or not
  6. https://email-checker.net/ → This works the same as Email Hippo. This kind of service can be wrapped in your own script to make things easier (see the sketch after this list)
  7. Forgot Password Section → Sometimes you can just type an email address into a site’s forgot-password form and check whether that account potentially exists
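Item 6 above suggests wrapping one of these services in your own script. As a minimal sketch of the idea, the snippet below queries Hunter.io’s v2 domain-search endpoint instead, since that one is publicly documented; verify the endpoint and JSON shape against Hunter’s current docs before relying on it. HUNTER_API_KEY and the script name are placeholders, and jq is assumed to be installed.

```bash
#!/usr/bin/env bash
# Hedged sketch: list known email addresses for a domain via Hunter.io's
# v2 domain-search endpoint. Requires a (free-tier) API key in HUNTER_API_KEY.
# Usage: ./hunt_emails.sh domain.com
DOMAIN="$1"
curl -s "https://api.hunter.io/v2/domain-search?domain=${DOMAIN}&api_key=${HUNTER_API_KEY}" \
  | jq -r '.data.emails[].value'   # print one address per line
```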

Gathering Breached Credentials with Breach-Parse

https://github.com/hmaverickadams/breach-parse

This is the GitHub repository of the Breach-Parse tool. It contains a link to a breached-password list which we can download, if we want to, using the torrent file given.

The tool simply runs through all the email address and password combinations present within the breach files and pulls out the ones matching the domain we provide.
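Usage looks roughly like the sketch below, per the repo’s README; double-check the exact syntax and the default breach-data path against the current README before running it.

```bash
# Hedged usage sketch: search the downloaded breach compilation for one
# domain and write the matching email:password combos to an output file.
# Syntax (verify against the repo): ./breach-parse.sh <@domain> <outfile>
./breach-parse.sh @tesla.com tesla.txt
```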

Hunting Breached Credentials with DeHashed

DeHashed is a website that provides breached records — password hashes, emails and everything we would require — but the tool is paid. Once we get a hash, we can figure out what kind of hash it is using the website www.hashes.org.
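If you’d rather identify the hash type offline, one option (a swapped-in tool, not mentioned in the course) is hashid, which ships with Kali:

```bash
# Hedged sketch: guess the algorithm behind a recovered hash with hashid.
# The hash below is just MD5("password"), used as a harmless example.
hashid '5f4dcc3b5aa765d61d8327deb882cf99'
# Output lists candidate algorithms, e.g. MD5, MD4, Double MD5, ...
```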

Web Information Gathering

Hunting Subdomains

  1. Sublist3r → A famous tool used to gather subdomains for a particular website. To install it, run the following command in the CLI: apt install sublist3r. To get subdomains out of it, run: sublist3r -d domain.com. To get the job done quickly, we can use multiple threads with: sublist3r -d domain.com -t 100
  2. crt.sh → A website that searches certificate transparency logs for subdomains. It even helps to find sub-subdomains.
  3. Amass → A famous tool when it comes to bug bounty. It takes a lot of time to enumerate the subdomains, but it is really thorough

NOTE → The subdomains these tools return may or may not be live, so we can use tomnomnom/httprobe and tools like that to check them, as in the sketch below.
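A minimal pipeline tying the steps together (domain.com is a placeholder; the crt.sh JSON query and jq filter are a common community pattern rather than an official API, so verify the output shape):

```bash
#!/usr/bin/env bash
# Hedged sketch: enumerate subdomains, then keep only the live ones.
DOMAIN="domain.com"

# 1. Sublist3r with 100 threads, saving results to a file
sublist3r -d "$DOMAIN" -t 100 -o subs.txt

# 2. Add certificate-transparency results from crt.sh
curl -s "https://crt.sh/?q=%25.${DOMAIN}&output=json" \
  | jq -r '.[].name_value' | sort -u >> subs.txt

# 3. Probe which of them actually answer over http/https
sort -u subs.txt | httprobe > live.txt
```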

Identifying Website Technologies

  1. www.builtwith.com → Used to learn about the web technologies used by a company. It tells us about the frameworks, mobile technologies, widgets, etc.
  2. Wappalyzer Extension → A browser extension that provides a little bit of information about the technologies used by the website
  3. whatweb → A CLI-based tool that provides us with the website’s technologies (example below)
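For example (whatweb ships with Kali; domain.com is a placeholder):

```bash
# Hedged example: fingerprint a site's technologies with whatweb.
whatweb domain.com        # quick one-line summary: server, CMS, JS libs
whatweb -v domain.com     # verbose output, one section per detected plugin
```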

Information Gathering with Burp Suite

Burp Suite is a web proxy, which gives it the capability of intercepting web traffic.

Steps to be followed :

  1. Go to your browser’s settings > change your proxy to the one Burp listens on; by default it is 127.0.0.1 on port 8080
  2. Then visit the page with the URL: http://burp
  3. Download the CA Certificate present at the top right corner
  4. Then again go to Privacy and Security > Certificates > Import, and import the CA certificate that you just downloaded.

In the Proxy section, we intercept the traffic, and all the requests and responses are shown here.

We can tamper with the intercepted traffic, for example by changing headers, before forwarding it on.
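A quick way to sanity-check the proxy from the CLI is to push a request through Burp with curl (127.0.0.1:8080 is Burp’s default listener; -k skips TLS verification in case Burp’s CA isn’t in curl’s trust store; domain.com is a placeholder):

```bash
# Hedged sketch: route one request through Burp's default proxy listener.
curl -x http://127.0.0.1:8080 -k https://domain.com/
# The request should show up in Proxy > HTTP history (and be held for
# editing if Intercept is on).
```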

Then, jumping onto the Target section, we get a site map of all the targets we intercepted traffic for.

Google Fu

This is also called Google Dorking.

Now, we will be learning some basics about it :

  1. site:domain.com → This will provide us with all the webpages Google has indexed with the domain in their URL.
  2. site:domain.com -www → This can be used to hide the www results from the Google results (handy for spotting subdomains).
  3. site:domain.com filetype:docx → This will search for documents indexed by the search engine that were published or uploaded by the domain’s company
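The operators can also be combined; a few examples (domain.com is a placeholder, and intitle:/inurl: are standard Google operators):

```
site:domain.com -www filetype:pdf
site:domain.com intitle:"index of"
site:domain.com inurl:admin
```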

Thank you✨!!

Clap👏 if you found it a good read.
