BytePane

Hex to Decimal Converter: Convert Hexadecimal to Decimal

Number Systems · 13 min read

The Myth: Hex Is Only for Low-Level Programmers

Web developers work with hexadecimal every day — usually without thinking about it. The #7C3AED in your CSS is a hex color. The 0xFF in a bitmask check is hex. The 2001:0db8:: at the top of an IPv6 configuration is hex. The a3f5b2c1 in a git commit hash, the 0x7FFEE4B2 in a crash dump, the %20 in a URL — all hex representations of binary data.

Hex isn't an esoteric systems-programming artifact. It's the default human-readable encoding for binary data across almost every domain of software engineering. Understanding the conversion between hex and decimal takes five minutes and removes a persistent cognitive overhead that shows up constantly in debugging, configuration, and code review.

This article explains the math from first principles, covers where you'll encounter hex as a working developer, and gives you copy-paste conversion code for every language you're likely to use.

Key Takeaways

  • Hex (base 16) uses digits 0–9 and letters A–F. One hex digit = 4 bits. Two hex digits = 1 byte (0–255). Eight hex digits = 4 bytes (0–4,294,967,295).
  • Conversion: multiply each digit by 16 raised to its position (right to left, starting at 0) and sum the results.
  • Hex is everywhere: CSS colors, IPv6 addresses (128-bit = 32 hex digits), memory pointers, SHA-256 hashes, UUID formats, bitmasks.
  • In JavaScript: parseInt('FF', 16) → 255. In Python: int('FF', 16) → 255. In Go: strconv.ParseInt("FF", 16, 64).
  • The 0x prefix signals hex in almost every language (C, C++, Java, JS, Python, Go, Rust). Without it, the same digits parse as decimal.

The Math Behind Hexadecimal: Base 16 Explained

Every number system is built on the same principle: positional notation. Each digit's contribution to the total is its face value multiplied by the base raised to its position.

In decimal (base 10), the number 419 means:

4 × 10² + 1 × 10¹ + 9 × 10⁰
= 4 × 100 + 1 × 10 + 9 × 1
= 400 + 10 + 9
= 419

In hexadecimal (base 16), the same value is written as 1A3. To convert to decimal, apply the same positional formula but with base 16, mapping A → 10:

1 × 16² + A × 16¹ + 3 × 16⁰
= 1 × 256 + 10 × 16 + 3 × 1
= 256 + 160 + 3
= 419
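The positional procedure above can be sketched as a few lines of Python — a hand-rolled illustration of what the built-in `int(s, 16)` already does, with the digit-to-value map written out explicitly:

```python
# Manual hex → decimal conversion using positional notation.
# This mirrors what int(s, 16) does; shown here for illustration only.
HEX_DIGITS = {d: i for i, d in enumerate('0123456789ABCDEF')}

def hex_to_decimal(s: str) -> int:
    total = 0
    for ch in s.upper():
        # Each new digit shifts the running total one position left (× 16).
        total = total * 16 + HEX_DIGITS[ch]
    return total

print(hex_to_decimal('1A3'))  # 419
print(hex_to_decimal('FF'))   # 255
```

Processing digits left to right and multiplying the running total by 16 each step is equivalent to summing digit × 16^position from the right.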

The Hex Digit Alphabet

Base 16 requires 16 distinct digit symbols. The first 10 are borrowed from decimal (0–9). The remaining six use letters:

| Hex Digit     | 0 | 1 | 2 | … | 9 | A  | B  | C  | D  | E  | F  |
| Decimal Value | 0 | 1 | 2 | … | 9 | 10 | 11 | 12 | 13 | 14 | 15 |

Hex is case-insensitive: FF, ff, and Ff all equal decimal 255. CSS uses both conventions — #ffffff and #FFFFFF are identical. Crypto outputs (SHA hashes, UUIDs) typically use lowercase by convention.
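A quick Python check confirms the case-insensitivity and the letter-to-value mapping:

```python
# A–F and a–f are interchangeable when parsing hex.
assert int('FF', 16) == int('ff', 16) == int('Ff', 16) == 255
# Letters map to 10–15 regardless of case.
assert int('a', 16) == int('A', 16) == 10
print('ok')
```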

Step-by-Step Conversion Examples

Example 1: FF → 255

F F
│ └─ position 0: F (15) × 16⁰ = 15 × 1  =  15
└─── position 1: F (15) × 16¹ = 15 × 16 = 240
                                             ───
                                             255

Example 2: 7C3AED → CSS purple color

# Split into byte pairs: 7C | 3A | ED
7C = 7 × 16 + 12 = 112 + 12 = 124  → Red:   124
3A = 3 × 16 + 10 =  48 + 10 =  58  → Green:  58
ED = 14 × 16 + 13 = 224 + 13 = 237 → Blue:  237

Result: rgb(124, 58, 237)  ← this is #7C3AED

Example 3: DEADBEEF → memory address

D  E  A  D  B  E  E  F
│  │  │  │  │  │  │  └─ pos 0: 15 × 16⁰ =           15
│  │  │  │  │  │  └──── pos 1: 14 × 16¹ =          224
│  │  │  │  │  └─────── pos 2: 14 × 16² =        3,584
│  │  │  │  └────────── pos 3: 11 × 16³ =       45,056
│  │  │  └───────────── pos 4: 13 × 16⁴ =      851,968
│  │  └──────────────── pos 5: 10 × 16⁵ =   10,485,760
│  └─────────────────── pos 6: 14 × 16⁶ =  234,881,024
└────────────────────── pos 7: 13 × 16⁷ = 3,489,660,928
                                           ─────────────
                                           3,735,928,559

DEADBEEF = 3,735,928,559 in decimal. It's a famous test value (a "magic number") used to fill uninitialized memory in debuggers — memorable in hex, completely opaque as a decimal integer.
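The digit-by-digit breakdown above can be verified in a Python one-liner:

```python
# Digits of DEADBEEF, left to right; digit at index i sits at position 7 - i.
digits = [13, 14, 10, 13, 11, 14, 14, 15]  # D E A D B E E F
total = sum(d * 16 ** (7 - i) for i, d in enumerate(digits))
assert total == int('DEADBEEF', 16) == 0xDEADBEEF
print(total)  # 3735928559
```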

Powers of 16: The Reference Table Every Developer Should Know

| Hex Digits | Bits                | Max Value (FF…F)  | Common Use Case |
| 1 digit    | 4 bits (nibble)     | 15                | Nibble flags, BCD |
| 2 digits   | 8 bits (1 byte)     | 255               | RGB channels, ASCII, IPv4 octets |
| 4 digits   | 16 bits (2 bytes)   | 65,535            | IPv6 groups, Unicode code points, port numbers |
| 6 digits   | 24 bits (3 bytes)   | 16,777,215        | RGB color space (16.7M colors) |
| 8 digits   | 32 bits (4 bytes)   | 4,294,967,295     | IPv4 addresses, 32-bit integers, memory addresses (x86) |
| 16 digits  | 64 bits (8 bytes)   | 18.4 quintillion  | 64-bit pointers, timestamps (nanoseconds), large IDs |
| 32 digits  | 128 bits (16 bytes) | 3.4 × 10³⁸        | IPv6 full address, UUIDs, MD5 hashes |
| 64 digits  | 256 bits (32 bytes) | 1.16 × 10⁷⁷       | SHA-256 hashes, Ethereum addresses |
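Every "Max Value" entry in the table follows from the same formula: n hex digits can hold at most 16ⁿ − 1 (equivalently 2⁴ⁿ − 1). A short sketch to regenerate the numeric column:

```python
# Max value representable in n hex digits is 16**n - 1 (i.e., 2**(4*n) - 1).
for n in (1, 2, 4, 6, 8, 16):
    print(f'{n:>2} hex digits = {4 * n:>3} bits, max {16 ** n - 1:,}')
# e.g. 16**2 - 1 = 255 and 16**8 - 1 = 4,294,967,295
```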

Where Hex Shows Up in Real Development Work

CSS Color Codes

The CSS color specification (CSS Color Level 4) supports hex notation in three forms. Hex is the most widely used color format in CSS codebases, appearing in over 85% of stylesheets analyzed in HTTP Archive data.

/* 6-digit: #RRGGBB */
color: #7C3AED;   /* rgb(124, 58, 237) — purple */

/* 3-digit shorthand: #RGB expands to #RRGGBB */
color: #F0A;      /* expands to #FF00AA */

/* 8-digit: #RRGGBBAA (with alpha) */
color: #7C3AED80; /* 50% transparent purple */

/* Converting with JavaScript */
const r = parseInt('7C', 16);  // 124
const g = parseInt('3A', 16);  // 58
const b = parseInt('ED', 16);  // 237

You can explore color formats and conversions in detail with BytePane's color formats guide.

IPv6 Addresses

IPv6 addresses are 128-bit numbers expressed as eight groups of four hex digits, separated by colons. Per ARIN (American Registry for Internet Numbers), IPv6 adoption has been growing steadily — Google's IPv6 statistics show over 45% of users globally now access Google over IPv6 as of early 2026.

# Full IPv6 address (128 bits = 32 hex digits)
2001:0db8:85a3:0000:0000:8a2e:0370:7334

# RFC 5952 compression rules:
# 1. Leading zeros in each group may be omitted
2001:db8:85a3:0:0:8a2e:370:7334

# 2. One consecutive run of all-zero groups → ::
2001:db8:85a3::8a2e:370:7334

# localhost in IPv6
::1   (expands to 0000:0000:0000:0000:0000:0000:0000:0001)
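Python's standard `ipaddress` module applies the RFC 5952 rules for you — a quick sketch of expansion and compression:

```python
import ipaddress

addr = ipaddress.IPv6Address('2001:0db8:85a3:0000:0000:8a2e:0370:7334')
print(addr.compressed)  # '2001:db8:85a3::8a2e:370:7334'
print(addr.exploded)    # '2001:0db8:85a3:0000:0000:8a2e:0370:7334'

# localhost expands from :: compression
print(ipaddress.IPv6Address('::1').exploded)
# '0000:0000:0000:0000:0000:0000:0000:0001'

# Underneath, the address is just a 128-bit integer.
print(int(addr))
```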

Bitmasks and Bitwise Operations

When working with permission flags, feature flags, or hardware registers, bitmasks expressed in hex are dramatically easier to reason about than decimal equivalents. One hex digit = 4 bits, so the relationship between hex and binary is direct.

// Permission flags modeled on Unix file modes (traditionally octal; the same principle applies to hex bitmasks)
const PERMISSION_READ    = 0x04  // 0000 0100
const PERMISSION_WRITE   = 0x02  // 0000 0010
const PERMISSION_EXECUTE = 0x01  // 0000 0001

// Set multiple permissions with OR
const permissions = PERMISSION_READ | PERMISSION_WRITE  // 0x06 = 0000 0110

// Check if a permission is set with AND
const canRead = (permissions & PERMISSION_READ) !== 0   // true

// In decimal: 4 | 2 = 6. In hex: 0x04 | 0x02 = 0x06
// The hex version makes the bit patterns immediately visible.

// Real-world: Node.js fs.constants
import { constants } from 'node:fs'
console.log(constants.O_RDONLY.toString(16))   // '0'
console.log(constants.O_WRONLY.toString(16))   // '1'
console.log(constants.O_RDWR.toString(16))     // '2'

Cryptographic Hashes

SHA-256 produces a 256-bit digest — always represented as 64 lowercase hex characters. Git commit hashes are SHA-1 (160 bits = 40 hex chars, though Git is transitioning to SHA-256). Ethereum addresses are the last 20 bytes (40 hex chars) of a Keccak-256 hash of the public key.

# SHA-256 of "hello" — 64 hex characters = 256 bits
2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824

# Git commit hash — 40 hex characters = 160 bits (SHA-1)
a3f5b2c1e8d9f4a2b7c6e1d0f3a5b8c2e4d7f9a1

# Ethereum address — 40 hex chars = 20 bytes
0x742d35Cc6634C0532925a3b844Bc454e4438f44e
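The SHA-256 digest of "hello" shown above is reproducible with Python's `hashlib`; `.hexdigest()` returns the conventional lowercase hex form:

```python
import hashlib

digest = hashlib.sha256(b'hello').hexdigest()
print(digest)
# 2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824

assert len(digest) == 64           # 64 hex chars = 256 bits
assert int(digest, 16) < 2 ** 256  # the digest is just a 256-bit integer
```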

Memory Addresses and Debugging

When a Node.js process crashes with a segfault, or you're reading a Rust core dump, memory addresses appear as hex. A 64-bit address like 0x7FFEE4B2C830 fits in 16 hex digits (leading zeros are usually omitted, so only 12 are shown here). Reading it in decimal (140,732,735,342,640) tells you nothing useful about its location in the address space. In hex, the leading 7F immediately signals "user space on Linux x86_64" — a structural signal invisible in decimal.

Hex ↔ Decimal Conversion Code in Every Language

JavaScript / TypeScript

// Hex string → decimal number
parseInt('FF', 16)           // 255
parseInt('7C3AED', 16)       // 8141549
parseInt('0x7C3AED', 16)     // 8141549 — 0x prefix handled automatically
Number('0x7C3AED')           // 8141549 — alternative syntax

// Decimal number → hex string
(255).toString(16)           // 'ff'
(255).toString(16).toUpperCase()  // 'FF'

// Pad to fixed length (e.g., always 2 hex chars for a byte)
(14).toString(16).padStart(2, '0')  // '0e' (not 'e')

// Convert RGB to hex color
function rgbToHex(r: number, g: number, b: number): string {
  return '#' + [r, g, b]
    .map(v => v.toString(16).padStart(2, '0'))
    .join('')
    .toUpperCase()
}
rgbToHex(124, 58, 237)  // '#7C3AED'

// Convert hex color to RGB
function hexToRgb(hex: string): { r: number; g: number; b: number } {
  const clean = hex.replace('#', '')
  return {
    r: parseInt(clean.slice(0, 2), 16),
    g: parseInt(clean.slice(2, 4), 16),
    b: parseInt(clean.slice(4, 6), 16),
  }
}
hexToRgb('#7C3AED')  // { r: 124, g: 58, b: 237 }

Python

# Hex string → int
int('FF', 16)           # 255
int('7C3AED', 16)       # 8141549
int('0x7C3AED', 16)     # 8141549 — 0x prefix handled
int('0x7C3AED', 0)      # 8141549 — base 0 auto-detects prefix

# Hex literal → int (Python evaluates at parse time)
value = 0x7C3AED        # int 8141549

# int → hex string
hex(255)                # '0xff'
hex(255)[2:]            # 'ff' — strip 0x prefix
f'{255:x}'              # 'ff' — format specifier
f'{255:X}'              # 'FF' — uppercase
f'{255:02x}'            # 'ff' — zero-padded to 2 chars
f'{14:02x}'             # '0e' — leading zero added

# bytes → hex string
b'\xde\xad\xbe\xef'.hex()   # 'deadbeef'
bytes.fromhex('deadbeef')   # b'\xde\xad\xbe\xef'

Go

import (
    "encoding/hex"
    "fmt"
    "strconv"
)

// Hex string → int64
n1, err := strconv.ParseInt("7C3AED", 16, 64) // 8141549, nil
n2, err := strconv.ParseInt("FF", 16, 64)     // 255, nil

// int → hex string (don't name a variable `hex` — it would shadow the encoding/hex import)
lower := strconv.FormatInt(255, 16) // "ff"
upper := fmt.Sprintf("%X", 255)     // "FF" — uppercase
padded := fmt.Sprintf("%02x", 14)   // "0e" — zero-padded

// []byte → hex string
data := []byte{0xDE, 0xAD, 0xBE, 0xEF}
s := hex.EncodeToString(data)         // "deadbeef"

// hex string → []byte
decoded, err := hex.DecodeString("deadbeef")  // [222 173 190 239], nil

Rust

// Hex literal (Rust evaluates at compile time)
let n: u32 = 0x7C3AED;  // 8141549

// Hex string → integer
let n = u32::from_str_radix("7C3AED", 16).unwrap();  // 8141549
let n = i64::from_str_radix("FF", 16).unwrap();       // 255

// integer → hex string
let s = format!("{:x}", 255u32);    // "ff"
let s = format!("{:X}", 255u32);    // "FF"
let s = format!("{:08X}", 255u32);  // "000000FF" — zero-padded to 8 chars

Common Hex Conversion Gotchas (and How to Avoid Them)

parseInt Without Radix

In JavaScript, calling parseInt('09') returns 9 in modern engines, but historically some interpreters treated a leading 0 as octal. Always pass the radix: parseInt('FF', 16). ESLint's radix rule enforces this.

JavaScript's 53-Bit Integer Limit

JavaScript's Number type can only represent integers exactly up to 2⁵³ − 1 (Number.MAX_SAFE_INTEGER). For 64-bit values like crypto hashes or nanosecond timestamps, use BigInt('0x' + hexString):

// WRONG — loses precision for values > 2^53
parseInt('FFFFFFFFFFFFFFFF', 16)  // 18446744073709552000 (wrong!)

// CORRECT — BigInt preserves full 64-bit precision
BigInt('0x' + 'FFFFFFFFFFFFFFFF')  // 18446744073709551615n (correct)

Signed vs. Unsigned Interpretation

The hex value 0xFFFFFFFF is 4,294,967,295 as an unsigned 32-bit integer — but −1 as a signed 32-bit integer using two's complement. Languages differ: C's unsigned int gives the unsigned value; JavaScript's bitwise operators always return signed 32-bit integers (0xFFFFFFFF | 0 returns −1). Use >>> 0 in JavaScript to force unsigned interpretation.
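Python integers are arbitrary-precision and effectively unsigned, so reinterpreting a hex value as a signed 32-bit integer takes an explicit step. A minimal two's-complement sketch (the helper name `to_signed32` is illustrative, not a standard-library function):

```python
def to_signed32(value: int) -> int:
    """Reinterpret a 32-bit unsigned value as two's-complement signed."""
    return value - 0x1_0000_0000 if value >= 0x8000_0000 else value

assert to_signed32(0xFFFFFFFF) == -1
assert to_signed32(0x7FFFFFFF) == 2147483647   # largest positive int32
assert to_signed32(0x80000000) == -2147483648  # most negative int32
print('ok')
```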

URL Percent-Encoding Uses Hex

URL encoding (%20 for a space) uses the hex encoding of the UTF-8 byte sequence. %20 = 0x20 = decimal 32 = ASCII space. %C3%A9 = the two-byte UTF-8 encoding of é (U+00E9). See the full explanation in BytePane's URL encoding guide.
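Python's `urllib.parse` makes the hex bytes behind percent-encoding visible directly:

```python
from urllib.parse import quote, unquote

print(quote(' '))   # '%20' — 0x20 is the ASCII space byte
print(quote('é'))   # '%C3%A9' — the two UTF-8 bytes of U+00E9

assert unquote('%20') == ' '
assert 'é'.encode('utf-8').hex() == 'c3a9'  # same two bytes as raw hex
```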

Frequently Asked Questions

How do you convert hexadecimal to decimal?

Multiply each hex digit by 16 raised to its positional power (starting at 0 from the right) and sum the results. For 1A3: (1 × 256) + (10 × 16) + (3 × 1) = 419. Hex digits A–F map to 10–15. In JavaScript: parseInt("1A3", 16) → 419. In Python: int("1A3", 16) → 419.

Why do programmers use hexadecimal instead of decimal?

Hexadecimal maps cleanly to binary: one hex digit = exactly 4 bits. So a 32-bit address needs 8 hex digits vs. 10 decimal digits. This compactness makes hex the default for memory addresses, color values, network addresses, cryptographic hashes, and bitmask constants in source code.

What does 0x mean in programming?

0x is a prefix signaling that the following number is hexadecimal (base 16). For example, 0xFF equals decimal 255. The convention originates in C and is adopted across C++, Java, JavaScript, Python, Go, Rust, and most other languages. Without 0x, the same digits are interpreted as decimal.

How do I convert hex color codes to decimal RGB values?

Split the 6-digit hex color into three 2-digit pairs and convert each to decimal. #7C3AED → 7C = 124, 3A = 58, ED = 237 → rgb(124, 58, 237). In JavaScript: const [r, g, b] = ["7C","3A","ED"].map(h => parseInt(h, 16)).

What is the largest number you can represent in 2 hex digits?

The largest 2-digit hex value is FF = decimal 255. This is why RGB channels max at 255 — each is stored as one byte (8 bits = 2 hex digits). A 4-digit hex value maxes at FFFF = 65,535. An 8-digit hex value maxes at FFFFFFFF = 4,294,967,295 (the 32-bit unsigned int limit).

How do IPv6 addresses use hexadecimal?

IPv6 addresses are 128-bit numbers written as eight groups of four hex digits separated by colons: 2001:db8:85a3::8a2e:370:7334 (compressed form). Hex makes 128 bits human-readable in 32 hex digits vs. 39 decimal digits. RFC 5952 drops leading zeros within each group and compresses one run of consecutive all-zero groups to :: for brevity.

Convert Hex to Decimal Instantly

Paste any hex value and get the decimal equivalent with step-by-step breakdown. Supports hex colors, memory addresses, and arbitrary hex integers.

Open Hex to Decimal Converter →