Monday 27 July 2009

DAMTUG Presentation bits

Here are the code bits from the DAMTUG presentation I did on the 15th of July in Dublin.

Firstly the Defensive Programming code bits.

The first one shows how to encrypt your web.config; it can be downloaded from here.

The second one is based on the directory traversal type of attack and can be downloaded from here (blog post here).

The Virtual Earth/Bing Maps demos can be downloaded from here

The Defensive Programming slide deck can be downloaded in XPS and PDF formats.

Thanks to all who came and also those who provided feedback. I will be doing some code samples for XSS over the coming weeks based on the feedback from the talk.

Issue #4 Directory Traversal

This issue is one where you can get caught out even though you think you are being smart :) Directory traversal means exactly what it sounds like: moving between directories when you are not supposed to be able to. This is not the same as browsing directories when directory browsing is enabled in IIS, for example. It means downloading files that should normally be impossible to reach, such as the web.config.

Firstly it relies on a bit of patience and some poking around to find the correct information. Let us consider the following scenario.

The developer has created a nifty file download solution. It takes the name of a file and sends the file to the user with a Save As dialog box. A common enough scenario. The developer has created this application using ASP.NET and used the TransmitFile method of the Response object.

A sample URL would pass the name of the file in a FilePath query string parameter. In the code-behind the developer uses code like the following:

// Build a name for the downloaded copy and read the requested
// path straight from the query string.
var filename = DateTime.Now.Ticks + ".txt";
var filePath = Request.QueryString["FilePath"];

if (string.IsNullOrEmpty(filePath)) return;

Response.ContentType = "text/plain";
Response.AppendHeader("Content-Disposition", "attachment; filename=" + filename);
// The user-supplied path is appended to the web root and served.
Response.TransmitFile(Server.MapPath("~") + filePath);

Fairly routine code, nothing too surprising. The developer creates a new filename, sets the content type of the response to plain text, and then serves the file, which prompts a Save As dialog box in most browsers.

So where is the problem?

Let's look at the following: Server.MapPath("~") returns the physical root directory of the web application, and the page blindly serves whatever file it is given as a text file. Let's assume for the moment that the directory it returns is c:\webs\Demo.

So what would happen if we changed the FilePath query string value in the URL to point at the web.config?

Now the web.config file would be served as a text file to the user. If this config file contained connection strings that were not encrypted, or other sensitive material, your system would be seriously compromised. Furthermore, if the system was built using the ASP.NET website template, you could download the ASPX and CS/VB code-behind files, as well as other DLLs, and reverse engineer them. Your system would be thoroughly penetrated.

Right, so you found this issue and you change the code to use a specific directory for downloadable files. Your code now looks like this:

var filename = DateTime.Now.Ticks + ".txt";
var filePath = Request.QueryString["FilePath"];

if (string.IsNullOrEmpty(filePath)) return;

Response.ContentType = "text/plain";
Response.AppendHeader("Content-Disposition", "attachment; filename=" + filename);
// Serve files from a dedicated downloads directory under the web root.
Response.TransmitFile(Server.MapPath("~/downloads/") + filePath);

So the download URL is still the same, and the directory it is trying to read from is now c:\webs\Demo\downloads.

If we try the URL again we will get an invalid file, as there is no web.config file in that directory. So we are safe? Well no, you are not. Let's change the FilePath query string variable once more, this time prefixing it with ..\ to step up a directory.

So in this case the directory is now c:\webs\Demo\downloads\..\, which resolves to c:\webs\Demo, because ..\ means go one directory up from the current one. Once again we can download the web.config. The path changes because path normalisation resolves the ..\ segment before the file is read.
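To see that normalisation concretely, here is a minimal sketch (in Python, since the behaviour is language-neutral; the ntpath module applies Windows path rules on any platform) using the directory names from the example above:

```python
import ntpath  # Windows path semantics, available on any platform

# The path the revised download page builds when FilePath starts with ..\
requested = r"c:\webs\Demo\downloads\..\web.config"

# Normalisation resolves the ..\ segment: the downloads directory is
# stepped over and the path lands back at the application root.
resolved = ntpath.normpath(requested)
print(resolved)  # c:\webs\Demo\web.config
```

The file system performs the same resolution when the file is opened, which is why the ..\ trick works against the "fixed" code.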

So how do you avoid such problems? Well, first, don't do what was just shown. If you do want to transmit a file to the user, make sure you know exactly what you are transmitting. Additionally, a check that it is the correct file type will usually give you an idea if something funky is happening.

You can prevent these types of problems by, again, validating your input and by not sending the filename you want to download across the wire. Using ID numbers is OK, but make sure you check those inputs as well.
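One way to do that validation, sketched here in Python with hypothetical names (safe_download_path and ALLOWED_DIR are mine, not from the post), is to canonicalise the requested path first and then check that the result is still inside the downloads directory. In ASP.NET the equivalent would be Path.GetFullPath plus a StartsWith check:

```python
import ntpath  # Windows path semantics, available on any platform

ALLOWED_DIR = r"c:\webs\Demo\downloads"  # the only directory we serve from

def safe_download_path(file_path):
    """Resolve the requested path; refuse anything that escapes
    the downloads directory (hypothetical helper, not from the post)."""
    full = ntpath.normpath(ntpath.join(ALLOWED_DIR, file_path))
    if not full.startswith(ALLOWED_DIR + "\\"):
        return None  # traversal attempt -- reject the request
    return full

print(safe_download_path("report.txt"))      # c:\webs\Demo\downloads\report.txt
print(safe_download_path(r"..\web.config"))  # None
```

The important detail is the order: normalise first, check second, so the ..\ segments are already resolved by the time the prefix test runs.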

Always put your web apps on a different partition than your system files, because otherwise a traversal attack could reach configuration files elsewhere on the same drive (provided the permissions allowed it).

Make sure your web server is fully patched and the correct permissions for your application are in force.

You can use tools such as URLScan and IIS Lockdown to scan your system for vulnerabilities. These tools are free and belong on a well-maintained server. Just be aware that URLScan and IIS Lockdown can sometimes adversely affect your server's ability to serve certain requests, such as ASMX, if you tighten the security too much or don't check the settings carefully.

You can download the code sample for this post here. It just shows how the code can be manipulated, and it serves as an example of what not to do!

Issue #5 Incorrect Permissions

This is a very common problem and usually comes about due to a lack of knowledge of how to secure an application or perhaps because the developer is afraid of what will break if they apply the correct permissions. Usually when we develop, we are administrators on our own machines and probably system administrators on the SQL Server machine as well.

Additionally it may come down to some components requiring elevated privileges to work. This can happen for example if you want to do Excel automation from an ASP.NET application.

Windows programmers can require Administrator rights as well, to access certain hives in the Windows registry. Even Visual Studio requires admin rights for some development tasks; without them you cannot debug certain application types.

If the permissions on your database are too loose, you leave yourself wide open to the more severe side of SQL injection attacks.

You should use the lowest level of permissions you can get away with for your application. It is better to add permissions than to take them away, so start with a highly restricted account and add only the permissions you need to ensure the correct working of your application.

Ideally you should use specific accounts for your applications, to allow separation of roles between applications.

Issue #6 Error Messages

You should never see the yellow screen of doom on a production web server. Full stop, end of story! It gives away too much information about your code and where it is stored, and it gives anyone who can read the stack trace ideas on how to penetrate your defenses.

The customErrors tag in your web.config should always be set to On, or to RemoteOnly at the very least. Setting it to RemoteOnly even in development will still allow you to debug the application locally, but once deployed it will not show the detailed error screen to any non-local users.

You should also make sure that you turn off debug in the web.config, as well as trace, as these also leak information.
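Those three settings live together in the web.config; a minimal sketch of what was just described might look like this:

```xml
<!-- web.config: a minimal sketch of the settings discussed above -->
<system.web>
  <!-- detailed errors shown only to local requests -->
  <customErrors mode="RemoteOnly" />
  <!-- no debug compilation in production -->
  <compilation debug="false" />
  <!-- no trace output -->
  <trace enabled="false" />
</system.web>
```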

Below is a screenshot from the Irish Examiner web site showing the yellow error screen. You can see how much additional information we get from it, such as the location of the files, that it is written in VB.NET, and that it runs on the 2.0 Framework.


Wednesday 22 July 2009

Issue #7 Client Side Validation

Now, how many people do you know who turn off JavaScript in their browser as a security measure? Not a lot, I would bet. This is because JavaScript is becoming more and more the backbone of the web user experience.

It also forms the basis of most of our client-side validation, because it works in all browsers. So simply by turning it off, an attacker can circumvent what many people use as their only line of defense.

Another possibility is that someone changes the validation function to return a positive result all the time, regardless of its inputs. This is usually quite easy to do with modern in-browser developer tools.

So to prevent these types of mistakes, you shouldn't rely on client-side validation as your only method of validating input before it hits your data repository.

You should always use a central validation source so that, for example, all strings are validated in the same manner, all integers likewise, and so on. This way you can manage changes to your code base more effectively, and you will also be following the DRY principle.
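As a rough illustration of the idea (sketched in Python; the names and patterns are hypothetical examples, not from the post), a single whitelist table gives every page the same rules for each input type:

```python
import re

# One central place where every input type is validated (DRY).
# Patterns are illustrative only; a real app would tune them.
VALIDATORS = {
    "integer": re.compile(r"^-?\d+$"),
    "name":    re.compile(r"^[A-Za-z' -]{1,50}$"),  # allows Irish apostrophes
}

def is_valid(kind, value):
    """True if value matches the whitelist pattern for its kind."""
    pattern = VALIDATORS.get(kind)
    return bool(pattern and pattern.fullmatch(value))

print(is_valid("integer", "42"))        # True
print(is_valid("integer", "42; DROP"))  # False
print(is_valid("name", "O'Brien"))      # True
```

Because every page calls the same helper, tightening a rule later means changing one pattern, not hunting through the code base.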

There is an example on The Daily WTF of this issue being exploited very easily, saving the submitter of the article a couple of dollars in the process.

You should use whitelists rather than blacklists. A whitelist defines what you expect, whereas a blacklist defines everything you don't expect, which you may or may not know in full. So which is easier to implement when you see it written like that?

You need to make sure that you escape any special characters. Most Irish-based developers are familiar with escaping the apostrophe, because it appears so frequently in Irish surnames. If your inputs can contain other special characters, you should expect them and code accordingly.
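For the database side of escaping, parameterised queries handle the apostrophe for you. Here is a small Python/sqlite3 sketch of the idea (in ASP.NET the equivalent would be SqlCommand with SqlParameter):

```python
import sqlite3

# Parameterised queries handle the apostrophe in Irish surnames safely:
# the driver escapes the value, so it cannot break out of the SQL string.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (surname TEXT)")
conn.execute("INSERT INTO users (surname) VALUES (?)", ("O'Brien",))

row = conn.execute(
    "SELECT surname FROM users WHERE surname = ?", ("O'Brien",)
).fetchone()
print(row[0])  # O'Brien
```

The value never gets concatenated into the SQL text, so there is nothing for a stray quote (or an injected payload) to break out of.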

Try to validate your inputs according to the RFC rules for that input type, and finally, when you are using XML, validate it against its schema. The schema validation support in the System.Xml namespace (for example, XmlReaderSettings with ValidationType.Schema) is very useful in this regard and should be in your standard toolkit of code snippets when you are dealing with XML.

Additional previous posts (10, 9, 8)

Issue #8 Not Patching

Following on from my previous posts (10 & 9)

In the Microsoft world we have Patch Tuesday, the second Tuesday of every month, when most of the patches appear on the Windows Update service. The Wednesday after it is commonly known as "Rollback" Wednesday, because invariably something breaks as a result. System administrators have therefore become wary of blindly applying patches to their servers without fully testing the implications of each patch in a staging environment. Unfortunately, this means that your servers may not be patched as soon as you would like them to be.

So even though you may have designed this ultra-secure system which has been put through its paces in the development and staging process, it still may be vulnerable due to some known or unknown exploit of the underlying web server or operating system.

All web servers are continuously being updated to defend against exploits as they are discovered, so closing off this issue relies on you keeping your systems up to date. Now, there will be a lot of developers out there who are not in charge of the servers they deploy to, whether because the systems are hosted with a provider or because the internal IT structure puts dedicated people in charge of the servers. What is required is a solid communications channel between the developers and the admins, to ensure that the underlying OS and web server are as solid as possible, removing this particular attack vector.

You can use Google for example to locate unpatched or unprotected servers.