In my article Gathering & Analyzing Metadata Information I emphasized the security risk posed by the hidden metadata of publicly shared documents and how this information can be gathered en masse with certain tools. So I began writing a series of articles analyzing the different types of file metadata and the tools one can use to view, edit, or remove them. In the first part I analyzed EXIF JPEG metadata, and in this article I continue with the famous Portable Document Format (PDF) file, presenting the appropriate tools for handling its metadata.
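As a quick taste of what hides inside a PDF, here is a stdlib-only Python sketch that pulls /Key (value) pairs out of an uncompressed Info dictionary. The sample bytes and field values below are invented for illustration; real files often store this dictionary compressed, which is where dedicated tools come in:

```python
import re

# A fragment of an (uncompressed) PDF Info dictionary, as it might appear
# inside a real file -- the field values here are made up for illustration.
pdf_bytes = (
    b"1 0 obj\n"
    b"<< /Title (Quarterly Report) /Author (jsmith) "
    b"/Creator (Microsoft Word) /Producer (Acrobat Distiller 9.0) >>\n"
    b"endobj\n"
)

# Pull out /Key (value) pairs. This only works on uncompressed Info
# dictionaries, but it shows how much leaks in plain text: author
# account names, authoring software, and software versions.
fields = dict(re.findall(rb"/(\w+)\s*\(([^)]*)\)", pdf_bytes))

for key, value in sorted(fields.items()):
    print(key.decode(), "=>", value.decode())
```

Note how a single dictionary already exposes a user name and the exact software versions used, both useful for profiling a target.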
Any organization or individual who sends or receives files (documents, spreadsheets, images, etc.) electronically needs to be aware of the dangers of hidden metadata. Metadata includes user names, path and system information (such as directories on your hard drive or network shares), software versions, and more. This data can be used for a brute-force attack, for social engineering, or for locating pockets of critical data once inside a compromised network. Thwarting an attacker's attempts to exploit the metadata easily found on your company's or personal website, in digital documents, and in search-engine caches is hard, if not nearly impossible.
During the information gathering and reconnaissance phases, potential intruders spend a great deal of time learning everything they can about their targets before they launch an attack. The gathered information is often crucial for finding a weakness in a system or network and in the users participating in them. Hackers can gather useful information by examining the metadata (data about data) of files used by the victim user, system, or network. In an attempt to enumerate all the possible metadata leakage cases, I will write a series of articles covering different file types and file groups. In this first part I examine EXIF JPEG metadata and how to handle it.
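For a quick feel of where EXIF data lives, here is a stdlib-only Python sketch that checks whether JPEG bytes carry an EXIF (APP1) segment. The sample byte strings are hand-built fragments for illustration, not complete JPEG files:

```python
def has_exif(data: bytes) -> bool:
    """Detect an EXIF (APP1) segment in JPEG bytes.

    JPEG files start with the SOI marker FF D8; EXIF metadata lives in
    an APP1 segment (marker FF E1) whose payload starts with b"Exif\\x00\\x00".
    """
    if not data.startswith(b"\xff\xd8"):
        return False
    i = 2
    while i + 4 <= len(data) and data[i] == 0xFF:
        marker = data[i + 1]
        # The 2-byte segment length includes itself but not the marker.
        length = int.from_bytes(data[i + 2:i + 4], "big")
        if marker == 0xE1 and data[i + 4:i + 10] == b"Exif\x00\x00":
            return True
        i += 2 + length
    return False

# Hand-built illustrative byte strings (not complete JPEG files):
with_exif = b"\xff\xd8" + b"\xff\xe1\x00\x08Exif\x00\x00"
plain_jfif = b"\xff\xd8" + b"\xff\xe0\x00\x10JFIF\x00" + b"\x00" * 9

print(has_exif(with_exif), has_exif(plain_jfif))  # True False
```

Parsing the actual tags inside the APP1 payload is more involved (TIFF structure, byte order, IFDs), which is exactly why the dedicated tools covered in the first article exist.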
A reverse DNS lookup reveals the domain names that are associated with an IP address. Website owners/maintainers on shared hosting and penetration testers use reverse DNS lookups extensively to find out which domain names are hosted on the target host. Many online lookup tools exist (like ip-address.com and domaintools.com), but I prefer to use a dedicated software tool for this job. Recently I came across the DRIL tool on the web.
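Under the hood, a reverse lookup is just a PTR query on the IP's reversed in-addr.arpa name. A minimal Python sketch of both halves (the `reverse_lookup` call needs working DNS, so treat it as illustrative):

```python
import socket

def reverse_name(ip: str) -> str:
    """Build the in-addr.arpa name used for a PTR (reverse DNS) query."""
    octets = ip.split(".")
    return ".".join(reversed(octets)) + ".in-addr.arpa"

def reverse_lookup(ip: str) -> str:
    """Resolve an IP back to a hostname (requires working DNS)."""
    hostname, _aliases, _addresses = socket.gethostbyaddr(ip)
    return hostname

print(reverse_name("192.0.2.10"))  # 10.2.0.192.in-addr.arpa
```

A PTR record returns only the name the IP's owner registered; tools like DRIL go further and enumerate the many virtual-host domains that may share the same address.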
The growth of GPU power has brought brute-force password cracking to the forefront by significantly decreasing the time required. A single-CPU system running a brute-force tool (like John the Ripper) would take about half a year to break a seven-character password drawn from the full character set (upper, lower, digit, special). A modern system with one or more high-performance GPUs, on the other hand, can break the same password in less than a day. However, when facing passwords of 9+ characters, the brute-forcing time is greater than a security audit can allow for. Typical password policies adopted in enterprise environments demand 8+ characters containing at least one uppercase letter, one special character, and one digit. The most common user response is to capitalize the first character and append a digit or special character to the end of the password. Another common pattern is to substitute some characters with special characters (like a->@, i->!, s->3). Knowing such patterns, you can profile passwords and launch masked attacks through user-specified rules, resulting in a significant decrease in cracking time.
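To see why masks matter, here is a back-of-the-envelope keyspace comparison in Python. The 10 GH/s GPU rate is an assumed round figure for illustration, not a benchmark:

```python
import string

# Full printable-ASCII charset: 26 lower + 26 upper + 10 digits + 32 specials.
FULL = len(string.ascii_lowercase + string.ascii_uppercase
           + string.digits + string.punctuation)  # 94

def seconds_to_crack(keyspace: int, guesses_per_sec: float) -> float:
    """Worst-case time to exhaust a keyspace at a given guess rate."""
    return keyspace / guesses_per_sec

full_7 = FULL ** 7                 # blind brute force: 94^7 candidates
# Masked attack for the "Capital first, lowers, digit at the end" pattern
# of an 8-character password, e.g. "Passwor1":
masked_8 = 26 * 26 ** 6 * 10

gpu_rate = 10e9  # assumed 10 billion guesses/second

print(f"full 7-char keyspace : {seconds_to_crack(full_7, gpu_rate):,.0f} s")
print(f"masked 8-char pattern: {seconds_to_crack(masked_8, gpu_rate):,.0f} s")
```

Even though the masked password is a character longer, its keyspace is hundreds of times smaller than the blind 7-character search, which is exactly the profiling advantage described above.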
Skipfish is a fully automated, active web application security reconnaissance tool released by Michal Zalewski (lcamtuf). Web developers and security professionals can use skipfish to run a series of tests against websites under their responsibility. Skipfish supports Linux, FreeBSD, Mac OS X, and Windows (Cygwin) environments (I ran my tests on a Debian distribution). The tool was released to the public by Google in order to offer an easy-to-use, high-speed solution for making websites safer.
Regularly, while surfing the web, we come across database cracking incidents. If we take a closer look at the cracked passwords, we realize that the top used passwords are pretty much the same. These kinds of incidents lead us to conclude that when it comes to passwords, humans are predictable. Usually the chosen passwords are unoriginal, using variations of commonly used words (@dmin, p@33word, Password123, etc.), and their length is between 1 and 9 characters in most cases. On top of that, many people use the same password across all of their web accounts and services, giving hackers access to everything they use. Most people think they have made a unique choice by changing a common password to something slightly different (like password -> P@ssword321). Cracking tools that implement dictionary attacks with a rule engine can find these types of passwords very easily. Until recently, the most commonly used dictionary and rule-engine tools (like John the Ripper, HashCat, etc.) were CPU based. Using modern multicore GPUs we can achieve more than 400x performance acceleration. And this is where cudaHashcat+ and oclHashcat+ come in.
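As a toy illustration of what a rule engine does, here is a minimal Python sketch that mangles a dictionary word with leetspeak substitutions and common suffixes. The rule set here is made up for the example; real engines like those in John the Ripper and hashcat are far richer:

```python
from itertools import product

# Toy mangling rules in the spirit of a cracker's rule engine:
# leetspeak substitutions plus a few common suffixes.
SUBS = {"a": "@", "i": "!", "s": "3", "o": "0"}
SUFFIXES = ["", "1", "123", "!"]

def mutations(word: str):
    """Yield capitalized/leet/suffixed variants of a dictionary word."""
    bases = {word, word.capitalize()}
    leet = "".join(SUBS.get(c, c) for c in word)
    bases |= {leet, leet.capitalize()}
    for base, suffix in product(sorted(bases), SUFFIXES):
        yield base + suffix

print(sorted(mutations("password"))[:4])
```

A single dictionary word expands into only a handful of candidates, so "unique" tweaks like Password123 or P@33w0rd fall in milliseconds; multiply that by a wordlist and a GPU and the predictability problem is obvious.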
Network security is crucial in the privileged-access VLAN of the Computer Center Laboratory's administration and support groups. Some network plugs are located in publicly accessible places, creating a security hole in the internal network. A commonly established solution in enterprise networks is to create MAC rules for each network interface, meaning that each network plug grants network access only to the permitted MAC address of the matched user, preventing MAC spoofing attacks. However, the working conditions of our infrastructure (frequent staff moves, hardware changes, etc.) make the above solution ineffective, creating great overhead for the NOC admins.
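The per-plug MAC rule idea boils down to an allowlist check per switch port, which can be sketched as follows (the port names and MAC addresses below are made up for illustration):

```python
# One registered MAC per network plug (switch port). In practice this
# table lives in the switch configuration, not in a script; the point
# is only to show the binding that must be maintained per plug.
ALLOWLIST = {
    "lab-sw1/port12": "aa:bb:cc:dd:ee:01",
    "lab-sw1/port13": "aa:bb:cc:dd:ee:02",
}

def is_permitted(port: str, mac: str) -> bool:
    """Accept traffic on a port only from its registered MAC address."""
    return ALLOWLIST.get(port) == mac.lower()

print(is_permitted("lab-sw1/port12", "AA:BB:CC:DD:EE:01"))  # registered host
print(is_permitted("lab-sw1/port12", "de:ad:be:ef:00:00"))  # spoof attempt
```

Every staff move or NIC replacement means editing this table, which is precisely the administrative overhead that makes the approach impractical in our environment.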
Tiny Encryption Algorithm (TEA) is a block cipher designed by R. Needham and D. Wheeler of the Cambridge Computer Laboratory. TEA is notable for the simplicity of its structure and implementation, typically just a few lines of code.
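Since TEA really is only a few lines, here is a straightforward Python transcription of its published round function: a 64-bit block as two 32-bit words, a 128-bit key as four 32-bit words, 32 rounds:

```python
MASK = 0xFFFFFFFF          # keep arithmetic in 32 bits
DELTA = 0x9E3779B9         # derived from the golden ratio

def tea_encrypt(v0, v1, key):
    """Encrypt one 64-bit block (v0, v1) with a 4-word key, 32 rounds."""
    s = 0
    for _ in range(32):
        s = (s + DELTA) & MASK
        v0 = (v0 + (((v1 << 4) + key[0]) ^ (v1 + s) ^ ((v1 >> 5) + key[1]))) & MASK
        v1 = (v1 + (((v0 << 4) + key[2]) ^ (v0 + s) ^ ((v0 >> 5) + key[3]))) & MASK
    return v0, v1

def tea_decrypt(v0, v1, key):
    """Invert the 32 rounds by running them backwards."""
    s = (DELTA * 32) & MASK
    for _ in range(32):
        v1 = (v1 - (((v0 << 4) + key[2]) ^ (v0 + s) ^ ((v0 >> 5) + key[3]))) & MASK
        v0 = (v0 - (((v1 << 4) + key[0]) ^ (v1 + s) ^ ((v1 >> 5) + key[1]))) & MASK
        s = (s - DELTA) & MASK
    return v0, v1
```

The whole cipher is a Feistel-like mix of shifts, additions, and XORs, which is why it ports so easily to constrained environments; note, though, that TEA has known weaknesses (e.g. equivalent keys) and is a teaching cipher rather than a modern choice.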
As part of network administration in the Network Operation Center (NOC) of the Computer Center Laboratory, we must secure the server and network infrastructure against internal and external malicious activity. Crucial base servers and network nodes have been independently secured against the majority of attacks. The challenge, however, is securing the non-crucial hosts, such as users' machines with access to servers and network devices at lower levels of the infrastructure. The obstacle to the whole concept is that we are on a university network, which includes people with widely varying levels of computer and IT knowledge, plus a large number of services and experimental technologies in use. So we must implement security countermeasures that are either invisible to end users or simple enough to be adopted by any user in the university infrastructure.