
MVC - How to log requests that are coming to our web application

There are moments when we want to log all the requests that are coming to our web application, from the request URL to the query or form parameters. In this blog post we will see where we can hook our code to do this.
This task can be done in different ways; we will look at four of them.

Action Filter
One solution is to create an action filter, which has access to all the request information. From the request we can read information like the request URL, query parameters, form parameters and so on.
public class FooActionFilterAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        HttpRequestBase request = filterContext.RequestContext.HttpContext.Request;

        string requestType = request.HttpMethod;
        string formParams = request.Form.ToString();
        string queryParams = request.QueryString.ToString();
        string url = request.Path;
    }
}
Each action or controller that has this action filter attached will log this information for us.
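As a quick sketch, the filter can be attached to a single action, to a whole controller, or registered as a global filter so that every action in the application is covered (ProductController and the App_Start location below are illustrative, not part of the original example):

```csharp
using System.Web.Mvc;

// Attached at controller level: every action in this controller is logged.
[FooActionFilter]
public class ProductController : Controller
{
    public ActionResult Index()
    {
        return View();
    }
}

// Or registered as a global filter (typically in App_Start/FilterConfig.cs),
// so that all actions in the application are logged.
public static class FilterConfig
{
    public static void RegisterGlobalFilters(GlobalFilterCollection filters)
    {
        filters.Add(new FooActionFilterAttribute());
    }
}
```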

Base Controller
It is very common for the controllers of a web application to share a base controller. This base controller can also be used for this purpose.
public class BaseController : Controller
{
    protected override void ExecuteCore()
    {
        HttpRequestBase request = this.Request;

        string requestType = request.HttpMethod;
        string formParams = request.Form.ToString();
        string queryParams = request.QueryString.ToString();
        string url = request.Path;
        // ... log the collected information ...
        base.ExecuteCore();
    }
}
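For illustration, any controller that derives from the base class picks up the logging automatically (ProductController is a made-up name):

```csharp
using System.Web.Mvc;

// Every request dispatched to this controller goes through
// BaseController.ExecuteCore first, so it is logged automatically.
public class ProductController : BaseController
{
    public ActionResult Index()
    {
        return View();
    }
}
```

One thing to keep in mind: starting with MVC 4, ExecuteCore is only invoked when the controller's DisableAsyncSupport property returns true, so on newer MVC versions overriding OnActionExecuting in the base controller is the safer hook.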

HttpModule
Another option is to create and register an HttpModule that logs all the information we need. If you also need to access information from the Session, then you should hook into an event that runs after the session is loaded (PreRequestHandlerExecute). Also, even if you keep an instance of the application (HttpApplication) from the moment you subscribe to an event, you should never use it when the event is triggered; each time, get the context from the sender parameter of the event.
public class FooHttpModule : IHttpModule
{
    private HttpApplication _context;

    // Private lock object; locking on 'this' is avoided because external
    // code could lock on the same instance and cause a deadlock.
    private readonly object _syncRoot = new object();

    private readonly bool _isActive;

    public FooHttpModule(IProfilingConfiguration profilingConfiguration)
    {
        _isActive = profilingConfiguration.IsProfileActive(ProfilingType.WebRequestContent);
    }

    public void Init(HttpApplication context)
    {
        if (!_isActive)
        {
            return;
        }

        _context = context;
        context.PreRequestHandlerExecute += PreRequestHandlerExecute;
    }

    private void PreRequestHandlerExecute(object sender, EventArgs e)
    {
        try
        {
            // Always take the context from the sender parameter,
            // not from the instance stored at registration time.
            MvcApplication context = sender as MvcApplication;
            if (context == null)
            {
                return;
            }

            HttpRequest request = context.Request;

            string requestExtension = VirtualPathUtility.GetExtension(request.FilePath).ToLower();
            if (!string.IsNullOrWhiteSpace(requestExtension))
            {
                // Requests for js, gif, css and other files with an extension are not logged in the trace.
                return;
            }

            string requestType = request.HttpMethod;
            string formParams = request.Form.ToString();
            string queryParams = request.QueryString.ToString();
            string url = request.Path;

            string userInformation = CurrentUserInformation.IsUserLogin
                ? CurrentUserInformation.AccountInformation.ToString()
                : string.Empty;

            // ... log the collected information ...
        }
        catch (Exception exception)
        {
            // ... log the exception; the module should never break the request ...
        }
    }

    public void Dispose()
    {
        lock (_syncRoot)
        {
            if (_context == null)
            {
                return;
            }

            _context.PreRequestHandlerExecute -= PreRequestHandlerExecute;
            _context = null;
        }
    }
}
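Before it can do anything, the module has to be registered. Assuming IIS 7+ in integrated pipeline mode, a minimal web.config entry looks like this (the namespace and assembly names are illustrative):

```xml
<configuration>
  <system.webServer>
    <modules>
      <!-- The name is arbitrary; type is "Namespace.ClassName, AssemblyName". -->
      <add name="FooHttpModule" type="MyApp.FooHttpModule, MyApp" />
    </modules>
  </system.webServer>
</configuration>
```

Note that IIS creates modules through their parameterless constructor, so the IProfilingConfiguration dependency from the example above would have to be resolved inside the module (for example via a service locator) rather than injected.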
IIS Logs
If you don’t want custom logging, you can activate the logging feature of IIS. There are three different levels of logging that can be activated from IIS for this scenario:

  • LogAll – logs all requests
  • LogSuccessful – logs only successful requests (status codes 100-399)
  • LogError – logs only unsuccessful requests (status codes 400-999)

By default, the logs can be found in the ‘C:\Windows\system32\LogFiles’ folder.
<httpLogging dontLog="false" selectiveLogging="LogError" />
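The same section can also be edited from the command line with appcmd, assuming the default IIS install location:

```shell
%windir%\system32\inetsrv\appcmd.exe set config /section:httpLogging /dontLog:False /selectiveLogging:LogError
```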

In this blog post we saw four different ways we can log incoming traffic. Based on our needs and preferences we can choose one solution or another.
If we need to log only the requests for specific actions or controllers, then the Action Filter is our best friend. Using it we can control logging in a granular way.
If we need to write custom logs for all the requests that are coming to our web application, or to filter them, then we can use the base controller or an HttpModule. Personally I prefer the second one, the HttpModule; I think it is the cleaner and better solution.
If we need to log all the requests that are coming to our web application and it is not important to filter them, then IIS is our friend.
