Calculates the number of characters produced by decoding a sequence of bytes from the specified byte array.
The number of characters produced by decoding the specified sequence of bytes.
Type Reason
ArgumentNullException bytes is null.
ArgumentOutOfRangeException index < 0.
-or-
count < 0.
-or-
index and count do not specify a valid range in bytes (that is, (index + count) > bytes.Length).
To calculate the exact array size required by UnicodeEncoding.GetChars(Byte[], Int32, Int32, Char[], Int32) to store the resulting characters, the application uses UnicodeEncoding.GetCharCount(Byte[], Int32, Int32). To calculate the maximum array size, the application should use UnicodeEncoding.GetMaxCharCount(Int32). The UnicodeEncoding.GetCharCount(Byte[], Int32, Int32) method generally allows allocation of less memory, while the UnicodeEncoding.GetMaxCharCount(Int32) method generally executes faster.
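The exact-versus-maximum trade-off above can be sketched as follows. This is an illustrative fragment, not part of the reference text; the input string "Hello" is an assumption chosen for the example.

```csharp
using System;
using System.Text;

// A UnicodeEncoding instance decodes UTF-16 byte sequences.
var unicode = new UnicodeEncoding();
byte[] bytes = unicode.GetBytes("Hello");

// Exact count: scans the byte range, so the resulting array has no slack.
int exact = unicode.GetCharCount(bytes, 0, bytes.Length);

// Maximum count: computed from the byte count alone, so it is faster
// but may request more memory than the data actually needs.
int max = unicode.GetMaxCharCount(bytes.Length);

// Allocate the exact size and decode into it.
char[] chars = new char[exact];
unicode.GetChars(bytes, 0, bytes.Length, chars, 0);

Console.WriteLine($"{exact} {max} {new string(chars)}");
```

Sizing the array with GetCharCount trades one extra pass over the bytes for a tight allocation; GetMaxCharCount avoids the pass at the cost of possible over-allocation.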
With error detection, an invalid sequence causes this method to throw an ArgumentException. Without error detection, invalid sequences are ignored, and no exception is thrown.
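The error-detection behavior can be demonstrated with the UnicodeEncoding(Boolean, Boolean, Boolean) constructor, whose third parameter enables throwing on invalid bytes. The truncated three-byte input below is an assumption chosen to form an invalid UTF-16 sequence (an odd byte count leaves an incomplete code unit).

```csharp
using System;
using System.Text;

// throwOnInvalidBytes: true enables error detection; false disables it.
var strict = new UnicodeEncoding(false, true, true);
var lenient = new UnicodeEncoding(false, true, false);

// 'a', then a lone trailing byte: not a complete UTF-16 code unit sequence.
byte[] invalid = { 0x61, 0x00, 0x62 };

bool threw = false;
try
{
    strict.GetCharCount(invalid, 0, invalid.Length);
}
catch (ArgumentException)
{
    // With error detection, the invalid sequence raises ArgumentException.
    threw = true;
}

// Without error detection, the count is computed and no exception is thrown.
int lenientCount = lenient.GetCharCount(invalid, 0, invalid.Length);

Console.WriteLine($"{threw} {lenientCount}");
```

Note that the exception raised with error detection enabled derives from ArgumentException, so catching ArgumentException as shown covers it.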