Building secure software is hard. I’ve spent years writing code in various languages, and the constant worry about memory safety, side-channel attacks, or subtle logic errors can keep you up at night. This is where Rust enters the picture for me. Its compile-time guarantees around memory management are a powerful ally, especially when you’re working on systems where a single mistake can lead to a catastrophic failure. Let’s talk about some practical ways to use Rust for security and cryptography.
When you need to take some data and produce a fixed-size, practically unique fingerprint for it, you use a hash function. Think of it like a digital summary. It’s a one-way street; you can’t get the original data back from the hash. I often use this to verify file integrity or, with careful additional steps, to store passwords. Here’s how you can compute a SHA-256 hash, which is a common and strong standard.
use sha2::{Sha256, Digest};

fn compute_hash(data: &[u8]) -> String {
    let mut hasher = Sha256::new();
    hasher.update(data);
    let result = hasher.finalize();
    format!("{:x}", result)
}

fn main() {
    let input = b"hello world";
    let hash = compute_hash(input);
    println!("SHA-256 hash: {}", hash); // Prints a 64-character hexadecimal string
}
The sha2 crate gives you a clean interface. You create a hasher, feed it data with update, and finalize to get the result. This hash will always be the same for the same input, so comparing hashes is a reliable way to check if data has been altered.
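For instance, checking that a downloaded file matches a published checksum is just hashing and comparing hex strings. Here’s a minimal sketch that reuses compute_hash from above; the file path and expected value are placeholders, not real artifacts.

use std::fs;

fn verify_download(path: &str, expected_hex: &str) -> std::io::Result<bool> {
    let contents = fs::read(path)?;
    // Equal hashes mean the file was not altered in transit or on disk.
    Ok(compute_hash(&contents) == expected_hex)
}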
Sometimes, you need to keep data secret, not just fingerprint it. This is where encryption comes in. Symmetric encryption uses the same key to lock and unlock the data. A robust modern choice is AES in GCM mode, which provides both confidentiality and assurance that the encrypted data hasn’t been tampered with.
use aes_gcm::{Aes256Gcm, Key, Nonce};
use aes_gcm::aead::{Aead, NewAead};
use rand::RngCore;

fn encrypt_message(key: &[u8], plaintext: &[u8]) -> Result<([u8; 12], Vec<u8>), aes_gcm::Error> {
    // Create a cipher instance from our 32-byte key
    let cipher = Aes256Gcm::new(Key::from_slice(key));
    // Generate a random 12-byte nonce. It must be unique for each encryption with the same key.
    let mut nonce_bytes = [0u8; 12];
    rand::thread_rng().fill_bytes(&mut nonce_bytes);
    let nonce = Nonce::from_slice(&nonce_bytes);
    // Return the nonce together with the ciphertext; the recipient needs both to decrypt.
    let ciphertext = cipher.encrypt(nonce, plaintext)?;
    Ok((nonce_bytes, ciphertext))
}

fn decrypt_message(key: &[u8], nonce: &[u8], ciphertext: &[u8]) -> Result<Vec<u8>, aes_gcm::Error> {
    let cipher = Aes256Gcm::new(Key::from_slice(key));
    let nonce = Nonce::from_slice(nonce);
    cipher.decrypt(nonce, ciphertext) // Returns the original plaintext if successful
}
A critical point here is the nonce. It’s a number used once, and you must never reuse the same nonce with the same key. In the example above, I generate it randomly and return it alongside the ciphertext; in a real system, you’d send or store that nonce with the ciphertext so the other side can decrypt. The GCM mode handles authentication automatically; if the ciphertext is modified, decryption will fail.
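Here’s a quick round trip using the two helpers above, plus a demonstration of the tamper check. The key is a throwaway generated on the spot; in practice it would come from your key-management layer.

fn main() {
    // A throwaway 32-byte key for demonstration purposes.
    let mut key = [0u8; 32];
    rand::thread_rng().fill_bytes(&mut key);

    let (nonce, ciphertext) = encrypt_message(&key, b"attack at dawn").expect("encryption failed");
    let plaintext = decrypt_message(&key, &nonce, &ciphertext).expect("decryption failed");
    assert_eq!(plaintext, b"attack at dawn".to_vec());

    // Flip one byte of the ciphertext; GCM's authentication makes decryption fail.
    let mut tampered = ciphertext.clone();
    tampered[0] ^= 0x01;
    assert!(decrypt_message(&key, &nonce, &tampered).is_err());
}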
This brings us to a fundamental need: good randomness. Keys, nonces, salts—they all need to be unpredictable. Rust’s rand crate can tap into your operating system’s source of secure random numbers.
use rand::RngCore;

fn generate_key() -> [u8; 32] {
    let mut key = [0u8; 32];
    rand::thread_rng().fill_bytes(&mut key);
    key
}

fn main() {
    let secret_key = generate_key();
    println!("My new key: {:?}", secret_key); // A cryptographically secure random 32-byte array
}
It looks simple, and that’s the point. You don’t want to roll your own random number generator for cryptography. Using the system’s provided entropy is the safest path.
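If you’d rather skip the thread-local generator and ask the operating system every time, rand also exposes OsRng; both are suitable for key material. A minimal sketch, assuming rand 0.8:

use rand::rngs::OsRng;
use rand::RngCore;

fn generate_nonce() -> [u8; 12] {
    let mut nonce = [0u8; 12];
    // OsRng draws directly from the OS entropy source on every call.
    OsRng.fill_bytes(&mut nonce);
    nonce
}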
Moving beyond shared secrets, we often need to prove identity or the origin of a message. This is the realm of public-key cryptography and digital signatures. I have a private key that only I know. I can use it to sign a message. Anyone with my corresponding public key can verify that the signature is valid, proving the message came from me and wasn’t changed.
use ring::signature;
use ring::rand::SystemRandom;

// Note: `from_pkcs8` takes an RNG argument as of ring 0.17; older releases omit it.
fn generate_keypair() -> Result<signature::EcdsaKeyPair, Box<dyn std::error::Error>> {
    let rng = SystemRandom::new();
    let pkcs8_bytes = signature::EcdsaKeyPair::generate_pkcs8(
        &signature::ECDSA_P256_SHA256_ASN1_SIGNING,
        &rng,
    )?;
    let key_pair = signature::EcdsaKeyPair::from_pkcs8(
        &signature::ECDSA_P256_SHA256_ASN1_SIGNING,
        pkcs8_bytes.as_ref(),
        &rng,
    )?;
    Ok(key_pair)
}
fn sign_and_verify() -> Result<(), Box<dyn std::error::Error>> {
    let rng = SystemRandom::new();
    let message = b"This is a critical system update.";
    // Generate a key pair
    let key_pair = generate_keypair()?;
    // Sign the message
    let signature = key_pair.sign(&rng, message)?;
    // Get the public key bytes to share
    let public_key_bytes = key_pair.public_key().as_ref();
    // Later, someone verifies it
    let peer_public_key = signature::UnparsedPublicKey::new(
        &signature::ECDSA_P256_SHA256_ASN1,
        public_key_bytes,
    );
    match peer_public_key.verify(message, signature.as_ref()) {
        Ok(()) => println!("Signature is valid. Message is authentic."),
        Err(_) => println!("Signature verification failed!"),
    }
    Ok(())
}
The ring crate provides these primitives. You generate a key pair, keep the private part safe, and share the public part. The signature is tied to the exact message content. Changing even a single bit will cause verification to fail.
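To see that failure mode concretely, here’s a small sketch that reuses generate_keypair from above: the genuine message verifies, and a modified one is rejected.

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let rng = SystemRandom::new();
    let key_pair = generate_keypair()?;
    let message = b"transfer 10 credits";
    let signature = key_pair.sign(&rng, message)?;

    let verifier = signature::UnparsedPublicKey::new(
        &signature::ECDSA_P256_SHA256_ASN1,
        key_pair.public_key().as_ref(),
    );
    // The original message verifies...
    assert!(verifier.verify(message, signature.as_ref()).is_ok());
    // ...but a modified message does not.
    assert!(verifier.verify(b"transfer 99 credits", signature.as_ref()).is_err());
    println!("Tampered message rejected, as expected.");
    Ok(())
}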
When users give you their passwords, you must handle them with extreme care. The old method of using a simple hash like SHA-256 is dangerously insufficient. Attackers use precomputed tables (rainbow tables) and powerful hardware to crack weak hashes. The solution is a deliberately slow, memory-hard password hashing algorithm like Argon2.
use argon2::{self, Config, ThreadMode, Variant, Version};
use rand::Rng;

fn hash_password(password: &str) -> Result<String, argon2::Error> {
    // Generate a unique, random salt for each password
    let salt: [u8; 16] = rand::thread_rng().gen();
    // Configure Argon2. These parameters increase the work factor.
    let config = Config {
        variant: Variant::Argon2id,
        version: Version::Version13,
        mem_cost: 4096, // Memory usage in KiB
        time_cost: 3,   // Number of iterations
        lanes: 2,       // Parallelism degree
        thread_mode: ThreadMode::Parallel,
        secret: &[],
        ad: &[],
        hash_length: 32,
    };
    argon2::hash_encoded(password.as_bytes(), &salt, &config)
}

fn verify_password(stored_hash: &str, attempted_password: &str) -> bool {
    // This compares the Argon2 hash of the attempt with the stored hash.
    // It handles the salt and parameters automatically.
    argon2::verify_encoded(stored_hash, attempted_password.as_bytes()).unwrap_or(false)
}

fn main() {
    let user_password = "MySuperSecret123!";
    let hash = hash_password(user_password).unwrap();
    println!("Store this hash in your database: {}", hash);
    // Later, during login
    let login_attempt = "MySuperSecret123!";
    if verify_password(&hash, login_attempt) {
        println!("Login successful.");
    } else {
        println!("Invalid password.");
    }
}
The key here is the configuration. mem_cost and time_cost can be tuned to make the hash operation expensive enough to slow down attackers but still acceptable for legitimate logins. The salt ensures that even identical passwords result in completely different hashes.
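If you want to sanity-check the cost on your own hardware, a rough timing loop is enough. This sketch reuses hash_password from above; it’s a calibration aid, not a benchmark.

use std::time::Instant;

fn calibrate_argon2() {
    let start = Instant::now();
    let _ = hash_password("calibration-only-password").unwrap();
    // A few hundred milliseconds per hash is a common target for interactive logins;
    // raise mem_cost and time_cost until you land in that range on production hardware.
    println!("One Argon2 hash took {:?}", start.elapsed());
}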
For systems that communicate over networks, encryption in transit is non-negotiable. This is where TLS comes in. While you might often rely on a service like a web server to handle this, there are times you need to embed TLS capabilities directly into your Rust application. The rustls crate is a pure-Rust TLS implementation that avoids dependencies on error-prone C libraries like OpenSSL.
use rustls::{ClientConnection, Stream};
use std::net::TcpStream;
use std::sync::Arc;
use std::io::{Read, Write};

fn make_tls_client_connection(hostname: &str) -> Result<(), Box<dyn std::error::Error>> {
    // Load root certificates that we trust (like those from major CAs)
    let mut root_store = rustls::RootCertStore::empty();
    for cert in rustls_native_certs::load_native_certs()? {
        root_store.add(&rustls::Certificate(cert.0))?;
    }
    let config = rustls::ClientConfig::builder()
        .with_safe_defaults()
        .with_root_certificates(root_store)
        .with_no_client_auth(); // We're not providing a client certificate
    let server_name = hostname.try_into()?;
    let mut tls_conn = ClientConnection::new(Arc::new(config), server_name)?;
    let mut tcp_stream = TcpStream::connect((hostname, 443))?;
    let mut tls_stream = Stream::new(&mut tls_conn, &mut tcp_stream);
    // Send an HTTP request over the encrypted channel
    let request = format!("GET / HTTP/1.1\r\nHost: {}\r\nConnection: close\r\n\r\n", hostname);
    tls_stream.write_all(request.as_bytes())?;
    tls_stream.flush()?;
    // Read the encrypted response
    let mut plaintext_response = Vec::new();
    tls_stream.read_to_end(&mut plaintext_response)?;
    println!("Received response (first 200 bytes): {:?}", &plaintext_response[..200.min(plaintext_response.len())]);
    Ok(())
}
This code sets up a client that verifies the server’s certificate is valid and issued by a trusted authority before establishing a connection. All data passed over the tls_stream is automatically encrypted and decrypted.
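Using it is then a single call; example.com below is just a stand-in for whatever host you actually need to reach.

fn main() -> Result<(), Box<dyn std::error::Error>> {
    make_tls_client_connection("example.com")
}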
Security vulnerabilities can be incredibly subtle. Consider a function that checks an API token. A naive implementation might compare bytes one by one and return false as soon as it finds a mismatch. An attacker can measure how long this check takes. If it fails on the first byte, it returns quickly. If it fails on the last byte, it takes longer. This timing difference leaks information. To prevent this, we need constant-time comparisons.
use subtle::ConstantTimeEq;

fn verify_token(expected: &[u8], provided: &[u8]) -> bool {
    // `ct_eq` takes the same amount of time no matter where the difference is.
    expected.ct_eq(provided).unwrap_u8() == 1
}

fn main() {
    let secret_token = b"APP.USER.REAL_TOKEN_XYZ";
    let user_attempt = b"APP.USER.WRONG_TOKEN_ABC";
    // This comparison's duration does not reveal where the mismatch was.
    if verify_token(secret_token, user_attempt) {
        println!("Access granted.");
    } else {
        println!("Access denied.");
    }
}
The subtle crate provides these operations. It’s crucial for any comparison involving secrets, like HMACs, password hash verifications, or authentication tokens.
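For example, checking an HMAC-SHA256 tag received over the wire works the same way. This is a sketch assuming the hmac and sha2 crates; verify_mac is my own helper name, not something either library provides.

use hmac::{Hmac, Mac};
use sha2::Sha256;
use subtle::ConstantTimeEq;

type HmacSha256 = Hmac<Sha256>;

fn verify_mac(key: &[u8], message: &[u8], received_tag: &[u8]) -> bool {
    let mut mac = HmacSha256::new_from_slice(key).expect("HMAC accepts any key length");
    mac.update(message);
    let expected_tag = mac.finalize().into_bytes();
    // Constant-time comparison, exactly as with the API token above.
    expected_tag.ct_eq(received_tag).unwrap_u8() == 1
}

The hmac crate’s own verify_slice performs an equivalent constant-time check internally; the rule is simply to never fall back to a plain ==.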
Finally, let’s think about what happens to sensitive data in memory after we’re done with it. In a language with a garbage collector, or even in Rust if we’re not careful, that data might sit in freed memory for a while before being overwritten. An attacker with a memory dump could find it. We need to explicitly clear it.
use zeroize::Zeroize;

fn handle_sensitive_data() {
    let mut secret_buffer: [u8; 4] = [0xDE, 0xAD, 0xBE, 0xEF]; // Some sensitive data
    use_secret(&secret_buffer);
    // Explicitly zero the memory before it goes out of scope
    secret_buffer.zeroize();
    println!("Buffer after zeroize: {:?}", secret_buffer); // Prints [0, 0, 0, 0]
    // The memory is now safe from accidental exposure.
}

fn use_secret(_data: &[u8]) {
    // Perform some operation
}
The zeroize crate ensures the memory is overwritten. Traits like ZeroizeOnDrop can even make this happen automatically when a struct is dropped, reducing the chance of human error. This is a defense-in-depth measure, but an important one for handling cryptographic keys and passwords.
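As a sketch of that automatic path, assuming zeroize’s derive feature is enabled, and with SessionKey as a made-up illustration struct:

use zeroize::{Zeroize, ZeroizeOnDrop};

// Hypothetical wrapper for key material; the derives wipe `bytes` when the value is dropped.
#[derive(Zeroize, ZeroizeOnDrop)]
struct SessionKey {
    bytes: [u8; 32],
}

fn main() {
    {
        let key = SessionKey { bytes: [0x42; 32] };
        // ... use the key ...
        let _ = &key.bytes;
    } // `key` is dropped here and its bytes are zeroed automatically.
}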
Each of these techniques tackles a specific, common challenge in secure systems development. Rust doesn’t automatically make your code secure, but its ownership model and rich ecosystem of well-designed crates give you a much stronger foundation to build upon. You spend less time worrying about memory corruption vulnerabilities and more time implementing your logic correctly. For me, that shift in focus is the most significant security feature Rust offers.