The brain must robustly store a large number of memories, corresponding to the many events and scenes a person encounters over a lifetime. However, in existing neural network models the number of memory states either grows only weakly with network size, or recall fails catastrophically under vanishingly small amounts of noise. Here we show that it is possible to construct an associative content-addressable memory (ACAM) with exponentially many stable states and robust error correction. The network possesses expander-graph connectivity on a restricted Boltzmann machine architecture. The expansion property allows simple neural network dynamics to perform on par with modern error-correcting codes. Appropriate networks can be constructed from sparse random connections combined with glomerular nodes and a local associative learning rule, using low-dynamic-range weights. Thus, sparse quasi-random constraint structures – characteristic of an important class of modern error-correcting codes – may support high-performance computation in artificial neural networks and the brain.
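The core idea – constraint (glomerular) nodes over a bipartite graph, with simple energy-descent dynamics performing error correction, as in modern sparse-graph codes – can be illustrated with a toy sketch. This is not the paper's construction: it stands in the small (7,4) Hamming parity-check matrix for the sparse expander constraints, treats each parity check as a constraint node, and uses greedy single-bit-flip descent on the number of violated constraints as the "neural" recall dynamics. The stable states (codewords) then act as stored memories that the dynamics recovers from single-bit corruption.

```python
from itertools import product

# Parity-check matrix of the (7,4) Hamming code: each row plays the role of
# a constraint/glomerular node, each column a visible unit. (Illustrative
# stand-in for the paper's sparse expander connectivity.)
H = [
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def unsatisfied(state):
    """Energy: number of parity constraints violated by the current state."""
    return sum(sum(h * s for h, s in zip(row, state)) % 2 for row in H)

def recall(state):
    """Greedy single-bit-flip descent on the constraint-violation energy."""
    state = list(state)
    while True:
        e = unsatisfied(state)
        if e == 0:
            return state            # all constraints satisfied: a stable state
        best_i, best_e = None, e
        for i in range(len(state)):
            state[i] ^= 1           # tentatively flip bit i
            e_i = unsatisfied(state)
            state[i] ^= 1           # undo the flip
            if e_i < best_e:
                best_i, best_e = i, e_i
        if best_i is None:          # local minimum: no flip reduces the energy
            return state
        state[best_i] ^= 1

# The zero-energy states (codewords) are the network's stored memories.
codewords = [list(w) for w in product([0, 1], repeat=7) if unsatisfied(w) == 0]
assert len(codewords) == 16         # 2^(7-3) codewords

# Every single-bit corruption of every memory is corrected by the dynamics.
for w in codewords:
    for i in range(7):
        noisy = list(w)
        noisy[i] ^= 1
        assert recall(noisy) == w
print("all", 16 * 7, "single-bit errors corrected")
```

Here the 16 codewords are the exponentially many (in the code dimension) stable states, and the purely local flip rule implements the error-correcting recall; the paper's expander-based networks achieve the analogous guarantee at scale.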