Long is a data type used in programming languages such as Java, C++, and C#. A constant or variable defined as long stores a single 64-bit signed integer.
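In Java, for example, a long can be declared like this (the variable name and value below are illustrative):

```java
public class LongExample {
    public static void main(String[] args) {
        // A long holds a 64-bit signed integer.
        // Literals larger than an int requires the L suffix.
        long worldPopulation = 8_000_000_000L; // too large for a 32-bit int
        System.out.println(worldPopulation);
    }
}
```

Note the `L` suffix: without it, Java treats the literal as an int and rejects any value above 2,147,483,647 at compile time.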

What, then, makes up a 64-bit signed integer? It helps to break the term into parts, from right to left. An integer is a whole number that does not have a decimal point, such as 1, 99, or 234,536.

“Signed” means the number can be either positive or negative; one of the bits is reserved to record the sign. “64-bit” means the value occupies 64 binary digits, which can represent 2^64, or 18,446,744,073,709,551,616, different values. Because one bit stores the sign, the possible range of a long integer is -9,223,372,036,854,775,808 (-2^63) to 9,223,372,036,854,775,807 (2^63 - 1), which includes 0.
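These limits are built into Java as constants, so you can print them rather than memorize them:

```java
public class LongRange {
    public static void main(String[] args) {
        // The smallest and largest values a 64-bit signed long can hold
        System.out.println(Long.MIN_VALUE); // -9223372036854775808
        System.out.println(Long.MAX_VALUE); //  9223372036854775807
    }
}
```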

Standard integer (int) data types typically store a 32-bit whole number in modern programming languages, with a maximum value of 2,147,483,647 (2^31 - 1). Therefore, if a variable or constant may hold a number larger than that, it is defined as a long rather than an int.
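A short sketch of what happens when an int runs out of room, and how widening to a long avoids it:

```java
public class IntVsLong {
    public static void main(String[] args) {
        int maxInt = Integer.MAX_VALUE;   // 2,147,483,647
        // int arithmetic wraps around silently on overflow
        System.out.println(maxInt + 1);   // prints -2147483648
        // widening to long before adding gives the correct result
        long big = (long) maxInt + 1;
        System.out.println(big);          // prints 2147483648
    }
}
```

This silent wraparound is exactly why values that might exceed the int range should be declared long from the start.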

In standard C, by contrast, a long is only guaranteed to be at least 32 bits wide, giving a minimum range of -2,147,483,648 to 2,147,483,647; on some platforms it is exactly 32 bits, and C programs that need a guaranteed 64-bit integer use the long long type instead.

See also: Gigabit
