The scope of a PoC

Let us talk about what the scope of a PoC should be and what you should or should not include in one.

Purpose of PoC
First, we need to define what the purpose of a PoC is. The main purpose is to demonstrate that the principles covered in the technical documents actually work (that they are not just theory and diagrams).

Reusability
It is already a déjà vu for me to hear people say that they want to reuse the PoC output in the main project. This happens because the PoC scope is often too big and does not cover only the ideas that need to be demonstrated.
When a PoC covers more than 15% of the implementation effort, you might have a problem. That is not a PoC anymore; it is a PILOT, which is a system with limited functionality that goes into production. The Pilot might have many restrictions, from NFRs to the business use cases that are covered, but some part of it works.
You never want to invest more in a PoC than is necessary, and you should always throw the output code away. Even if it is working code, it should never reach the Pilot or production. Yes, the technical team might look over the PoC implementation for inspiration, but nothing more than that.

Scope
It is very common to add to the PoC scope things that are general truths. A general truth is something that is already well documented and proven by others.
A good example is using a PoC to check that Azure AD can be used as the authentication mechanism inside an ASP.NET MVC application. You already know that it works, and you will find plenty of official documentation. Yes, maybe the team does not have experience with it, but that does not mean it will not work.
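To see how small this general truth is in practice, here is a minimal sketch of the Azure AD sign-in wiring in ASP.NET MVC with the OWIN OpenID Connect middleware (Microsoft.Owin.Security.OpenIdConnect); the tenant and client id values are placeholders, not values from any real project:

    // Startup.Auth.cs - minimal Azure AD sign-in for ASP.NET MVC via OWIN.
    // The client id and tenant below are hypothetical placeholders.
    using Owin;
    using Microsoft.Owin.Security.Cookies;
    using Microsoft.Owin.Security.OpenIdConnect;

    public partial class Startup
    {
        public void ConfigureAuth(IAppBuilder app)
        {
            app.SetDefaultSignInAsAuthenticationType(CookieAuthenticationDefaults.AuthenticationType);
            app.UseCookieAuthentication(new CookieAuthenticationOptions());

            app.UseOpenIdConnectAuthentication(new OpenIdConnectAuthenticationOptions
            {
                ClientId = "<application-client-id>",
                Authority = "https://login.microsoftonline.com/<tenant>",
                PostLogoutRedirectUri = "https://localhost:44300/"
            });
        }
    }

This is exactly the boilerplate that the official samples already cover, which is why proving it again inside a PoC adds no value.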


Challenge the PoC Scope
Let us take, for example, the following PoC scope:

  • Import data from a 3rd party provider
  • Store data inside CosmosDB
  • Validate data and move it to Azure SQL
  • Expose data inside an ASP.NET MVC application using OData
  • Use Azure AD for authentication in front of OData Service
  • Use data access restrictions to OData services based on user role
  • Create a simple web application based on Angular
  • Display data that was read from OData Service

What do you think: is the scope of this PoC valid?
I personally would say that it is not. Most of the items included in the PoC are general truths, and you know that they will work. Let's take a look at them one by one.
Import data from a 3rd party provider: This is a valid PoC item and it should be kept. Even if it is more on the integration side, you want to validate that you can communicate with an external 3rd party and extract data from it.
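As an illustration only, a sketch of what this check could look like, assuming the provider exposes a JSON endpoint over HTTPS (the base address, API key header, and resource path are invented):

    // Minimal integration check: can we authenticate against the 3rd party
    // provider and extract data from it? Endpoint and key are hypothetical.
    using System;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Threading.Tasks;

    public static class ThirdPartyImportPoc
    {
        public static async Task<string> FetchRawDataAsync()
        {
            using (var client = new HttpClient())
            {
                client.BaseAddress = new Uri("https://api.example-provider.com/");
                client.DefaultRequestHeaders.Add("x-api-key", "<provider-api-key>");
                client.DefaultRequestHeaders.Accept.Add(
                    new MediaTypeWithQualityHeaderValue("application/json"));

                // One representative call is enough; the PoC only has to prove
                // that communication and data extraction work.
                HttpResponseMessage response = await client.GetAsync("v1/orders?limit=10");
                response.EnsureSuccessStatusCode();
                return await response.Content.ReadAsStringAsync();
            }
        }
    }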
Store data from the 3rd party inside CosmosDB: You need to keep the data somewhere, but what is the purpose of this inside the PoC? Taking the other items into account, the data is kept only to validate it and move it to Azure SQL. In this case, the validation can be done on the fly, and there is no need to keep the data inside CosmosDB for the PoC.
Validate data and move it to Azure SQL: The validation is a valid item to show how the data transformation is done, but you only need to cover one data type; there is no need to cover all of them. You can keep the data inside Azure SQL for the PoC, but a binary file would work just as well. You are demonstrating the validation part, not the storage; it is clear that you can store data inside Azure SQL.
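A minimal sketch of this idea, with a hypothetical Order type and a single made-up validation rule, writing the validated output to a flat file instead of a real store:

    // On-the-fly validation of one data type. 'Order' and its rule are
    // invented for illustration; the point is the transformation, not storage.
    using System;
    using System.IO;

    public class Order
    {
        public string Id { get; set; }
        public decimal Amount { get; set; }
    }

    public static class ValidationPoc
    {
        public static bool IsValid(Order order)
        {
            return !string.IsNullOrWhiteSpace(order.Id) && order.Amount > 0m;
        }

        public static void ValidateAndPersist(Order order, string path)
        {
            if (!IsValid(order))
            {
                throw new InvalidOperationException($"Order {order.Id} failed validation.");
            }

            // A flat file is enough here; storing in Azure SQL is a general truth.
            File.AppendAllText(path, $"{order.Id};{order.Amount}{Environment.NewLine}");
        }
    }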
Expose data inside an ASP.NET MVC application using OData: This can be added to the scope as long as you expose only one entity and you want to validate that you can restrict access to data based on user role. Otherwise, it is a general truth that OData can be used to offer access to entities. Another case where you would want to include OData inside a PoC is when you have a complex data structure and you want to be sure that it is possible to expose only a part of it in the way that you want.
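For illustration, a sketch of exposing a single entity set with ASP.NET Web API 2 and OData (the Microsoft.AspNet.OData package); the Order entity is the hypothetical one from the validation sketch above:

    // One entity set exposed over OData; one entity is enough for the PoC.
    using System.Linq;
    using System.Web.Http;
    using Microsoft.AspNet.OData;
    using Microsoft.AspNet.OData.Builder;
    using Microsoft.AspNet.OData.Extensions;

    public static class WebApiConfig
    {
        public static void Register(HttpConfiguration config)
        {
            var builder = new ODataConventionModelBuilder();
            builder.EntitySet<Order>("Orders");   // the only entity in the PoC
            config.MapODataServiceRoute("odata", "odata", builder.GetEdmModel());
        }
    }

    public class OrdersController : ODataController
    {
        [EnableQuery]
        public IQueryable<Order> Get()
        {
            // Any in-memory source works; the storage itself is not under test.
            return new[] { new Order { Id = "A-1", Amount = 10m } }.AsQueryable();
        }
    }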
Use Azure AD for authentication in front of the OData service: If this were the only item inside the PoC, it would be a general truth. Combined with the next requirement, it might make sense to include it inside the PoC, even though the next item can also be seen as a general truth.
Use data access restrictions on OData services based on user role: We know that inside an ASP.NET MVC application you can restrict access based on user role. This is also a kind of general truth, but let's say that it might be included inside a PoC.
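A sketch of how the restriction could look on the OData controller; the "Sales.Reader" role name is hypothetical, and with Azure AD the roles arrive as claims that plug into the standard [Authorize] attribute:

    // Role-based restriction on the OData controller. The role name is
    // hypothetical; Azure AD delivers roles as claims on the principal.
    using System.Linq;
    using System.Web.Http;
    using Microsoft.AspNet.OData;

    [Authorize(Roles = "Sales.Reader")]
    public class SecuredOrdersController : ODataController
    {
        [EnableQuery]
        public IQueryable<Order> Get()
        {
            // Same hypothetical Order entity as in the earlier sketches.
            return new[] { new Order { Id = "A-1", Amount = 10m } }.AsQueryable();
        }
    }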
Create a simple web application based on Angular: What is the purpose of this? You can validate the authentication and role-based access using simple C# code; there is no need to create an Angular application just for that. The only reason would be to define the base template of the application, and in that case you are building the pilot, not a PoC.
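A sketch of such a check as a small console application that acquires a token with ADAL (Microsoft.IdentityModel.Clients.ActiveDirectory) and calls the protected OData endpoint; the ids, secret, and URLs are placeholders, and the client-credential flow shown here exercises application roles, though the shape is the same for a user flow:

    // Console check for authentication and role-based access; no UI needed.
    // All ids, secrets, and URLs below are hypothetical placeholders.
    using System;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Threading.Tasks;
    using Microsoft.IdentityModel.Clients.ActiveDirectory;

    public static class AuthCheck
    {
        public static async Task Main()
        {
            var authContext = new AuthenticationContext(
                "https://login.microsoftonline.com/<tenant>");
            AuthenticationResult token = await authContext.AcquireTokenAsync(
                "<service-app-id-uri>",
                new ClientCredential("<client-id>", "<client-secret>"));

            using (var client = new HttpClient())
            {
                client.DefaultRequestHeaders.Authorization =
                    new AuthenticationHeaderValue("Bearer", token.AccessToken);

                // 200 proves access for the role; 401/403 proves the restriction.
                HttpResponseMessage response =
                    await client.GetAsync("https://localhost:44300/odata/Orders");
                Console.WriteLine((int)response.StatusCode);
            }
        }
    }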
Display data that was read from the OData service inside Angular: As long as you do not use a custom UI control or something similar, it is pretty clear that this will work, and it does not make sense to include it.

As we can see, most of the things that were included are general truths, and you do not need to cover them inside a PoC.

PoC Optimization
There are some things that you can do to optimize the PoC. Even if you do not plan to use Entity Framework (EF) in the real project, it might be useful to extract data from SQL using EF for the PoC. You can use the database-first designer, and you would not need to write a line of data access code. The same goes for OData: you can expose content that is read using EF, which gives you the option to write almost zero lines of code to read data from Azure SQL and expose it as OData. This is applicable as long as the scope does not include NFRs like read performance from storage.
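As a sketch, assuming the database-first designer generated a PocDbContext with an Orders set (both names are hypothetical stand-ins), the only hand-written code would be the controller that hands the EF query to OData:

    // The EF designer generates the context and entities; only this thin
    // controller is written by hand. PocDbContext is a hypothetical name.
    using System.Linq;
    using Microsoft.AspNet.OData;

    public class OrdersController : ODataController
    {
        private readonly PocDbContext db = new PocDbContext();

        [EnableQuery]  // $filter, $select, $top translate to SQL through EF
        public IQueryable<Order> Get()
        {
            return db.Orders;
        }

        protected override void Dispose(bool disposing)
        {
            if (disposing)
            {
                db.Dispose();
            }
            base.Dispose(disposing);
        }
    }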

What you should remember

  • Focus the PoC only on the things that you need to prove work.
  • Do not validate inside a PoC things that you already know are general truths.
  • Avoid defining the design of the application during the PoC.
  • Do not reuse PoC code in production.

Comments

  1. I would say that sometimes just a better name should be found - very often, a PoC means "an opportunity to learn about some new technology".. :) So it is used as a time-boxed way to later answer questions like: "how long would _you_ need to complete that?" or "how hard is it for _you_ to use that technology?" :)
