The quality of some code
Sometimes when I decompile code I come across, it can be really interesting and odd to deconstruct the thought patterns of the developers behind it. In this case, I was experiencing a bug with an ASP.NET web app I'd set up recently, and the company that makes it wasn't being the most helpful, so I decided to decompile one of the problem assemblies, reverse engineer it, and fix their bug myself.
You can find some odd behavior that way, though... in this case, they love utility classes, and completely abuse them. Here is one thing I found while looking through the code:
object[] middleArray = new object[length];
System.Array.Copy(originalArray, index, middleArray, 0, length);
byte[] finalArray = new byte[middleArray.Length];
for (int i = 0; i < middleArray.Length; i++)
    finalArray[i] = (byte) middleArray[i];
They basically copy twice, when they only need to copy once. Also, that's the shortened version. The initial processing was in one class, the System.Array.Copy was done in a utility method that copies a segment of one array into an object array, and then another utility converted an array of object to byte (yes, specifically to byte... as if it's something they do often enough to warrant a utility).
All those lines and methods could really be compressed down to this:
byte[] finalArray = new byte[length];
System.Array.Copy(originalArray, index, finalArray, 0, length);
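If you want to convince yourself the two versions really are equivalent, here's a minimal, self-contained sketch comparing them side by side. The method names and sample data are mine for illustration, not from the decompiled assembly; the interesting detail is that Array.Copy from a byte[] into an object[] boxes every element, which the loop then immediately unboxes again.

```csharp
using System;

public class CopyDemo
{
    // The decompiled round-trip: byte[] -> object[] -> byte[].
    public static byte[] ViaObjects(byte[] originalArray, int index, int length)
    {
        object[] middleArray = new object[length];
        Array.Copy(originalArray, index, middleArray, 0, length); // boxes each byte
        byte[] finalArray = new byte[middleArray.Length];
        for (int i = 0; i < middleArray.Length; i++)
            finalArray[i] = (byte) middleArray[i];                // unboxes it again
        return finalArray;
    }

    // The single direct copy.
    public static byte[] Direct(byte[] originalArray, int index, int length)
    {
        byte[] finalArray = new byte[length];
        Array.Copy(originalArray, index, finalArray, 0, length);
        return finalArray;
    }

    public static void Main()
    {
        byte[] data = { 10, 20, 30, 40, 50, 60 };  // sample data, not from the real app
        Console.WriteLine(string.Join(",", ViaObjects(data, 1, 3))); // 20,30,40
        Console.WriteLine(string.Join(",", Direct(data, 1, 3)));     // 20,30,40
    }
}
```

Same bytes out either way; the object[] detour just adds an extra allocation plus a box/unbox per element.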
Wow, imagine that.
Not really sure what to make of things like that. Poor design? Over-engineering? Someone's twisted sense of job security?