Blog: How Tos

Revisiting old tools

David Lodge 12 Jun 2020

Many, many years ago I was onsite and noticed that a company’s internal website had been deployed by checking it out of the Subversion version control system. The Subversion archive contained the site’s web.config, which held a set of credentials for SQL Server and, through many steps, led to domain admin.

This formed the basis of my first ever presentation (at East Midlands OWASP) and ultimately led to a blog post and a tool, which I then expanded to cover Mercurial and Git.

Other people have since produced different versions of this tool and I’ve seen it used in several CTFs.

So why am I writing about something I did seven years ago? Recently one of my colleagues came across a Chrome extension that automatically checks for /.git/config on the sites you browse.

Some internal discussion and browsing led to a load of us installing this extension and identifying a number of sites that had been deployed with a git checkout and were leaking information, including secret keys and database credentials.
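The extension’s core check is simple enough to sketch in a few lines. This is an illustrative version using only the Python standard library; the function names and the `[core]` marker heuristic are mine, not taken from the extension itself.

```python
# Minimal sketch of the check: request /.git/config from a site and
# see whether the response looks like a real git config file, which
# starts with an INI-style [core] section. Illustrative only.
import urllib.parse
import urllib.request


def git_config_url(base_url):
    """Build the URL of the .git/config file for a site."""
    return urllib.parse.urljoin(base_url.rstrip("/") + "/", ".git/config")


def looks_like_git_config(text):
    """A plain-text git config starts with a [core] section."""
    return text.lstrip().startswith("[core]")


def check_site(base_url, timeout=5):
    """Return True if the site appears to expose its .git/config."""
    try:
        with urllib.request.urlopen(git_config_url(base_url),
                                    timeout=timeout) as resp:
            return looks_like_git_config(resp.read().decode("utf-8", "replace"))
    except OSError:
        return False
```

Checking the file’s content, not just the HTTP status code, matters: many sites return a friendly 200 page for missing paths, which would otherwise be a false positive.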

This is interesting for a couple of reasons.

Reoccurring vulnerabilities

Firstly, it demonstrates that old vulnerabilities do not die and can re-occur – we’ve seen similar with IDOR (Insecure Direct Object Reference), which virtually disappeared and then came back, and to a lesser extent with SQL injection. Now we’re seeing it with version control systems.

These are simple vulnerabilities that occur because it is easier to perform an action in the insecure fashion and the user just hasn’t thought about the security impact. They are also trivially easy to detect and exploit.

This re-emphasises how important it is to include some very basic security testing in release processes. In the git case, running any web scanner (e.g. Nikto) would have identified this problem.

Passively scanning

The second reason is that it shows the potential of passively scanning for vulnerabilities. This is something that others have experimented with before. For a long time I varied my user agent between an XSS payload and a SQL injection payload (and yes, I did find some vulnerabilities that way).
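The user-agent trick can be sketched with the standard library: every request you send carries the payload, and any log viewer or analytics page that later renders it unescaped reveals itself. The payload strings and function name below are illustrative placeholders, not the ones I used.

```python
# Sketch of passive scanning via the User-Agent header: the payload
# rides along with normal browsing and only "fires" if a back-end
# page reflects it unsafely. Payloads here are illustrative.
import urllib.request

XSS_UA = "Mozilla/5.0 <script>alert('ua')</script>"
SQLI_UA = "Mozilla/5.0' OR '1'='1"


def tagged_request(url, payload):
    """Build a request whose User-Agent header carries the payload."""
    return urllib.request.Request(url, headers={"User-Agent": payload})
```

Obviously this only ever detects the vulnerability when the payload happens to land somewhere visible, which is what makes it passive rather than a scan.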

But now that we have extensions it’s easy to kick off some simple alerts about the basic security stance of a site as you browse it. This may be useful for those who engage in bug bounties.

Tool update

All the exploit tools for /.git directories out there require the whole archive to be downloaded locally and then extracted, which can take a lot of time. With that in mind I updated my git-grab program: I ported it to Python 3 and improved its interface so that it caches data and allows simple viewing of the files that could be in the repository, and of their contents.

The tool can be found at my github account.

The interface is now:

usage: git-grab [-h] [--cache [CACHE]] [--verbose] [--outdir OUTDIR]
                url action [files [files ...]]

Abuse .git repos on web servers

positional arguments:
  url              base URL of site
  action           Action to perform: ls, download, view
  files            list of file globs

optional arguments:
  -h, --help       show this help message and exit
  --cache [CACHE]  Directory to cache downloaded files
  --verbose        Be verbose
  --outdir OUTDIR  Directory to store output

Three commands are available:

  • ls – list files in the git directory
  • download – download a glob (a pattern) of files in the directory
  • view – display the contents of a pattern of files to the screen
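Under the hood, a view-style command doesn’t need the whole archive: a loose git object lives at a path derived from its SHA-1 and is just a zlib-compressed record. This is a hedged sketch of that low-level step; the function names are mine, not taken from git-grab itself.

```python
# A loose git object is stored at .git/objects/<first 2 sha chars>/
# <remaining 38 chars> and is a zlib-compressed "<type> <size>\0<content>"
# record, so a single file can be fetched and decoded on its own.
import zlib


def object_path(sha):
    """Relative path of a loose object inside the .git directory."""
    return ".git/objects/{}/{}".format(sha[:2], sha[2:])


def decode_object(raw):
    """Decompress a loose object and return (type, content)."""
    data = zlib.decompress(raw)
    header, _, content = data.partition(b"\x00")
    obj_type, _size = header.split(b" ")
    return obj_type.decode(), content
```

This is why per-file viewing is so much faster than a full download: one HTTP request and one decompression per file, rather than pulling the entire object store.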

Some examples:

List all files in a .git directory:

$ ./git-grab https://xxxxxxxxxxxxxx ls

View a file in the directory:

$ ./git-grab https://xxxxxxxxxxxxxx view info.php
// we should remove this

Download a git directory to ./domainname:

$ ./git-grab https://xxxxxxxxxxxxxx download
Downloading admin.php
Downloading index.php


The take-away here is nice and simple: don’t forget old tools.

If you wrote something that was useful back in the day, revisit it. If there is still a call for it, rework it and keep it relevant and helpful.