Basically, it's considering the bits in the opposite order to the one you were expecting. You haven't shown how you're mapping your input binary to a BitArray, but the result suggests it's being treated as 1100 rather than 0011.
The documentation isn't clear, admittedly, but it does work the way I'd expect it to: bitArray[0] represents the least significant bit, just as bit 0 usually does when discussing binary (so bit 0 contributes 0 or 1, bit 1 contributes 0 or 2, bit 2 contributes 0 or 4, bit 3 contributes 0 or 8, etc.). For example:
using System;
using System.Collections;

class Program
{
    static void Main(string[] args)
    {
        BitArray bits = new BitArray(8);
        bits[0] = false;
        bits[1] = true;

        int[] array = new int[1];
        bits.CopyTo(array, 0);
        Console.WriteLine(array[0]); // Prints 2
    }
}
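If your input is a binary string written most-significant-bit first (like "0011"), you need to reverse the index mapping yourself when filling the BitArray. A minimal sketch of one way to do that (ParseBits is a hypothetical helper I've made up for illustration, not part of the framework):

    using System;
    using System.Collections;

    class BitOrderDemo
    {
        // Hypothetical helper: maps a most-significant-bit-first string
        // such as "0011" onto a BitArray where index 0 is least significant.
        static BitArray ParseBits(string binary)
        {
            BitArray bits = new BitArray(binary.Length);
            for (int i = 0; i < binary.Length; i++)
            {
                // The last character of the string becomes bit 0.
                bits[binary.Length - 1 - i] = binary[i] == '1';
            }
            return bits;
        }

        static void Main()
        {
            BitArray bits = ParseBits("0011");
            int[] array = new int[1];
            bits.CopyTo(array, 0);
            Console.WriteLine(array[0]); // Prints 3
        }
    }

With that mapping, "0011" comes out as 3, which I suspect is what you were expecting in the first place.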