# FPGAs and Accuracy
FPGAs, or "Field programmable gate arrays," have become very popular among hobbyists since 2017 due to a few factors. Intel's Terasic DE10-Nano, based on the Cyclone V FPGA SoC, was briefly subsidized for educational adoption and provided a stable platform for developers like many otherwise generic platforms before it (e.g. the BBC Micro), and with that, the promise of hardware-based, rather than software-based emulation. Unsurprisingly, consumer-focused efforts in this space have focused on legacy videogames; Analogue, a Seattle-based manufacturer of boutique reproduction FPGA-based game consoles, goes so far as to base its marketing claims around there being "no emulation" in its products.
FPGAs resemble very early, pre-microprocessor discrete logic computers (the story of Steve Wozniak's arcade version of Breakout is my favourite of these) in that they mostly consist of logic gates and RAM blocks without a common architecture; the difference is that FPGAs are manufactured at the same density as modern chips, and therefore offer millions more generic, configurable gates than was previously possible. Thus, they can be made to act "natively" like a 6502, or a 68000, or some combination of hardware common to a specific platform (e.g. the Sega Genesis' 68000, Z80, and YM2612), and thereby run software on the "bare metal," i.e., without any intermediate layer.
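To make "generic, configurable gates" concrete: the basic cell of an FPGA fabric is the lookup table (LUT), a tiny RAM whose contents are a truth table. Here is a minimal software model of one; the class and its bit-packing are illustrative, not any vendor's actual primitive.

```python
# Minimal software model of an FPGA logic element: a K-input lookup
# table (LUT). "Configuring" the fabric amounts to filling thousands
# of these tiny truth-table RAMs and wiring their inputs and outputs.
class LUT:
    def __init__(self, truth_table: int, inputs: int):
        self.table = truth_table    # 2**inputs bits, packed into an int
        self.inputs = inputs

    def eval(self, *bits: int) -> int:
        assert len(bits) == self.inputs
        index = 0
        for b in bits:              # the inputs select one row of the table
            index = (index << 1) | b
        return (self.table >> index) & 1

# The same cell "acts like" XOR, AND, or anything else, depending
# purely on the bits loaded into it:
xor2 = LUT(0b0110, inputs=2)        # rows 00,01,10,11 -> 0,1,1,0
and2 = LUT(0b1000, inputs=2)        # rows 00,01,10,11 -> 0,0,0,1
assert xor2.eval(1, 0) == 1 and and2.eval(1, 1) == 1
```

String enough of these together in the right shape and the fabric behaves, cycle for cycle, like the CPU or sound chip you described to it.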
This is an interesting development. Intermediating layers are in other contexts more popular than they've ever been; any organization running its server software on bare metal in 2020, rather than in an array of more portable virtual machines, is probably facing down severe technical debt. However, emulation of video games (which we can charitably extend to emulation of all realtime, performance-based works) is notoriously sensitive to timings, and a few milliseconds of audio or input latency are widely considered by hobbyists to compromise its authenticity.
This problem has proven so intractable in the past, given the limitations of emulator performance, that it has given rise to extremely complex solutions such as Ubershaders (https://dolphin-emu.org/blog/2017/07/30/ubershaders/) and Runahead (https://docs.libretro.com/guides/runahead/). The former write-up is worth reading in full and too long to summarize here; the latter involves configuring an emulator to silently run a second instance of a game in the background and swap internal memory state between the two based on player inputs, all to compensate for a single 60 Hz frame (16.7 ms) that could otherwise be dropped here or there. This is serious stuff done by committed engineers. FPGA programming is hardly accessible to most, but can seem elegantly simple by comparison.
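To see why these workarounds get complicated, here is a simplified, single-instance sketch of the runahead idea. The `core` object, with its `save_state`, `load_state`, and `run_frame` methods, is hypothetical; this is the shape of the technique, not RetroArch's actual implementation.

```python
# Simplified single-instance runahead: each tick, advance the real
# game state by one frame but *display* a frame from n frames in the
# future, hiding up to n frames of the game's own internal input lag.
def tick(core, player_input, n, present):
    # Advance the authoritative game state by exactly one real frame.
    core.run_frame(player_input, render=False)
    snapshot = core.save_state()

    # Simulate n frames into the future (input held constant) and
    # present the last one. n >= 1.
    for _ in range(n - 1):
        core.run_frame(player_input, render=False)
    present(core.run_frame(player_input, render=True))

    # Rewind; the next tick resumes from the authoritative state.
    core.load_state(snapshot)
```

The cost is emulating n + 1 frames in every 16.7 ms slot, plus a savestate round trip each tick, which is why runahead demands serious headroom from the host machine.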
However, FPGA-based emulation[1] also lacks many of the value-added features and innovations of the past 30 years of software emulation, such as graphical upscaling or turnkey memory manipulation -- configurability carries penalties. FPGAs are not competitively performant for their logic density; while one can be made to perform like a 1996 game console that would otherwise require a PC from a decade-plus later to emulate in software, it cannot simultaneously access the spare compute cycles of that more modern PC. Right now it is unclear how much faster FPGAs can get, especially given their relatively erratic development and niche use cases outside of emulation; Intel, given the unprofitability of the hobbyist market, probably means to position them as an alternative to GPUs for high-performance computing, but the current state of the art still favours GPUs in almost all such cases.
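To make "turnkey memory manipulation" concrete: in a software emulator the guest machine's RAM is an ordinary host-side array, so a cheat is a one-line write per frame. A minimal sketch, with a made-up address standing in for a real cheat-database entry:

```python
# In a software emulator, guest RAM is just a host-side byte array, so
# cheats, trainers, and save editors are trivial. The address below is
# invented for illustration; real values come from community cheat
# databases (Game Genie / Pro Action Replay codes and the like).
LIVES_ADDR = 0x075A                    # hypothetical work-RAM address

def apply_cheats(guest_ram: bytearray) -> None:
    guest_ram[LIVES_ADDR] = 99         # freeze the lives counter each frame
```

On an FPGA core, the equivalent state is scattered across fabric registers and block RAM, and exposing it this conveniently is something the core's author has to design in deliberately.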
As of this writing, the most popular open source FPGA project is the MiSTer (https://github.com/MiSTer-devel/Main_MiSTer/wiki), which targets the popular DE10 board along with other relatively common add-on components in order to provide both analog and digital video output and a few other GPIO-driven extras. As with RetroArch, the FPGA code needed to emulate specific consoles is distributed as individual "cores" which target stable APIs for input handling and audio/video output (sketched below), and, also like RetroArch, support for non-gaming platforms is steadily growing. But the motivation for hardware emulation over software emulation for these platforms is less clear, as access to other legacy software generally benefits more from portability (i.e., the ability to run on a server or in a browser rather than on a device with a comparable footprint to a Raspberry Pi) than from very precise timings.
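In software terms, the "core" arrangement amounts to a small, stable contract: the front end owns the window, the audio device, and the controllers, and every core, from Genesis to Amiga, fills in the same few hooks. The sketch below is illustrative, loosely in the spirit of libretro, and not the actual libretro or MiSTer framework API.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class Frame:
    pixels: bytes       # one video frame; the front end decides how to scale it
    samples: bytes      # audio covering the same 1/60 s slice

# Illustrative "core" contract. The front end handles display, sound,
# and input devices; each core only has to model its machine.
class Core(ABC):
    @abstractmethod
    def load(self, image: bytes) -> None:
        """Accept a ROM or disk image."""

    @abstractmethod
    def run_frame(self, inputs: dict) -> Frame:
        """Advance the machine by one frame and return its A/V output."""
```

The same division of labour holds on the MiSTer, except that the "hooks" are bus interfaces in the FPGA fabric rather than function calls.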
# Videogames and hobbyists
That being said, emulator compatibility with the entire universe of Classic Macintosh or Amiga software has always been considerably less complete than compatibility with console game libraries. This follows a common pattern: because legacy videogame hobbyists are so dedicated and enthusiastic, and because videogame consoles have historically been "closed" platforms that precluded self-publishing of software, it is relatively easy to query a catalog of every officially released videogame for a given platform (as well as all international variants of those games); if any of them are found not to work properly in popular emulators, those games become notorious in the community until a fix is developed. Conversely, there is plenty of old Mac software that is probably unknown to us, and of the titles that are more or less available through typical emulation channels, a fair number are known not to quite work with the best-available emulators. FPGAs will likely help remedy this problem.
[1]: We will be referring to FPGA-driven access to legacy computing environments as emulation for simplicity's sake; whether or not it is technically reducible to hardware emulation is a semantic argument that we are uninterested in.