In recent months I switched from developing applications to designing their build infrastructure on Microsoft Azure. It was quite challenging at the beginning, but it’s a very interesting and rewarding field.
We know that C# is the .NET language of choice for building vanilla class libraries. A library written in C# is easily consumed from any other .NET language; after all, this is why the BCL itself is written in it.
Web scraping is the act of extracting data from websites. From a programming standpoint it means performing an HTTP request and parsing the HTML response. This may involve taking care of various low-level details, such as managing a stateful session and coping with badly formed HTML.
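As a minimal sketch of the two steps just described (request, then parse), here is an F# snippet that downloads a page and pulls out its `<title>` element with a regular expression. The function name and the regex-based extraction are illustrative choices, not part of the original post; a real scraper would use a tolerant HTML parser to survive badly formed markup.

```fsharp
open System.Net.Http
open System.Text.RegularExpressions

// Hypothetical example: fetch a page and extract its <title> text, if any.
let fetchTitle (url: string) : string option =
    use client = new HttpClient ()
    // Step 1: perform the HTTP request.
    let html = client.GetStringAsync(url).Result
    // Step 2: parse the response (here, naively, with a regex).
    let m = Regex.Match (html, "<title>(.*?)</title>", RegexOptions.IgnoreCase)
    if m.Success then Some m.Groups.[1].Value else None
```

Returning `string option` instead of `null` also forces the caller to handle the page-without-a-title case explicitly.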
I had a Blogger blog for years. It was my first blog; I had a few projects on CodePlex (now discontinued) and used it mainly to announce updates. In those glorious days a project was born that you may know if you’re a .NET developer: the Command Line Parser library, at the moment hosted inside a GitHub organization.
Discriminated unions are an expressive construct used to describe values that can be modeled as one of several named cases. This is the syntax as given in the MSDN documentation:
type type-name =
    | case-identifier1 [of type1 [ * type2 ...]]
    | case-identifier2 [of type3 [ * type4 ...]]
    ...
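To make the grammar concrete, here is a small example of my own (the `Shape` type is not from the MSDN page): a union with three cases, two of which carry data, consumed with pattern matching.

```fsharp
// A discriminated union: a Shape is exactly one of these three cases.
type Shape =
    | Circle of radius: float
    | Rectangle of width: float * height: float
    | Point

// Pattern matching deconstructs each case; the compiler warns if one is missing.
let area shape =
    match shape with
    | Circle r          -> System.Math.PI * r * r
    | Rectangle (w, h)  -> w * h
    | Point             -> 0.0
```

Note how `Rectangle` uses `*` to carry a tuple of two fields, matching the `[of type3 [ * type4 ...]]` part of the grammar above.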
Avoiding returning null is all it takes to avoid having to handle nulls at all. Quoting NOOO:
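One way to put this into practice in F# (my own illustrative example, not from the quoted manifesto) is to return an `option` instead of null, so the absence of a value is part of the type and the compiler makes the caller deal with it:

```fsharp
// Hypothetical lookup: instead of returning null when the key is missing,
// return None and let the type system force the caller to handle it.
let tryFindUser (users: Map<string, string>) (id: string) : string option =
    users |> Map.tryFind id

let users = Map.ofList [ "1", "alice" ]

match tryFindUser users "2" with
| Some name -> printfn "Found %s" name
| None      -> printfn "No such user"   // this branch cannot be forgotten
```

There is no null to check for anywhere: the only way to get at the value is to match both cases.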
With this post I want to share and reason about a brutal technique that helped me when refactoring some code smells.