How many bits are used to represent Unicode, ASCII, UTF-16, and UTF-8 characters?
That should get you started. Also, Unicode's first 128 code points (0-127) match ASCII, and its first 256 match ISO-8859-1 (Latin-1).
UTF-8: one to four 8-bit bytes, so 8-32 bits per character (characters in the ASCII range fit in a single byte)
UTF-16: one or two 16-bit code units, so 16-32 bits per character
Unicode: not applicable; it describes a standard, not an encoding, and can be represented with different encodings (UTF-8 and UTF-16, for example).
Note: it would have been quicker for you to just look this up on Wikipedia.
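You can verify these ranges yourself in Java by encoding a few sample strings and counting the bytes. A minimal sketch (the class name and sample characters are my own choices):

```java
import java.nio.charset.StandardCharsets;

public class EncodingSizes {
    public static void main(String[] args) {
        // "A" is in the ASCII range; "€" (U+20AC) needs 3 UTF-8 bytes;
        // "\uD834\uDD1E" is U+1D11E (musical G clef), outside the BMP
        String[] samples = { "A", "\u20AC", "\uD834\uDD1E" };
        for (String s : samples) {
            int utf8Bits  = s.getBytes(StandardCharsets.UTF_8).length * 8;
            // UTF_16BE avoids the 2-byte byte-order mark that UTF_16 prepends
            int utf16Bits = s.getBytes(StandardCharsets.UTF_16BE).length * 8;
            System.out.println("U+" + Integer.toHexString(s.codePointAt(0)).toUpperCase()
                    + ": UTF-8 = " + utf8Bits + " bits, UTF-16 = " + utf16Bits + " bits");
        }
    }
}
```

This prints 8/16 bits for "A", 24/16 bits for the euro sign, and 32/32 bits for the G clef, matching the ranges above.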
A Java char is 16 bits, and ASCII needs 7 bits. Although the ASCII character set uses only seven bits, each character is usually stored in an eight-bit byte. UTF-8 represents characters using patterns of 8, 16, 24, or 32 bits. UTF-16 uses patterns of 16 or 32 bits; characters outside the Basic Multilingual Plane take two 16-bit code units (a surrogate pair).
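The 16-bit char claim is easy to check directly: Java exposes the char width as a constant, and Character.charCount tells you how many chars a given code point occupies. A small sketch (class name is my own):

```java
public class JavaCharWidth {
    public static void main(String[] args) {
        // A Java char is a single 16-bit UTF-16 code unit
        System.out.println(Character.SIZE);             // 16

        // Code points above U+FFFF need a surrogate pair, i.e. two chars
        int gClef = 0x1D11E;                            // U+1D11E, outside the BMP
        System.out.println(Character.charCount('A'));   // 1 char  -> 16 bits
        System.out.println(Character.charCount(gClef)); // 2 chars -> 32 bits
    }
}
```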