
Digital electronics has profoundly changed musical instrument design. From toy keyboards to performance-grade pianos, synthesizers, and drum sets, instruments that once would have been finely crafted wood and metal can today find their voices in CPUs, memory, and data converters.
This does not mean craftsmanship is dead. There is as much skill, experience, and love of music in the intellectual property (IP) inside today’s electronic instruments as in the workshop of a traditional piano maker or luthier. It is just expressed differently. A look inside an instrument will illustrate this point.
A generic architecture
A concert grand piano, an early analog synthesizer, a drum set, and a clarinet could hardly look less alike. Yet, functionally, the digital electronic versions of all these instruments can share a single block diagram and signal flow. Figure 1 displays the block diagram of an ASIC inside electronic musical instruments.
Figure 1 The ASIC for electronic musical instruments is shown with its key building blocks. Source: Faraday Technology Corp.
One or more input devices capture the musician’s intent. This could be just a row of membrane switches for a toy keyboard. A professional piano might have a position sensor on each key and pedal. An electronic version of a clarinet might mean a pressure or velocity transducer and position sensors on the keys. A synthesizer might have a microphone for voice input.
The choice of sensors must both meet cost guidelines and capture what is essential about the musician’s actions at that price point. This must include subtleties, such as a pianist’s attack and graduated use of the pedal or a saxophonist’s voicing and modulated use of the keys.
The analog sensor signals will pass through signal-conditioning circuitry and into analog-to-digital conversion. The resulting digital signal streams—which, at this point, represent the musician’s actions, not sounds—will go into a digital subsystem. This subsystem will generally comprise a CPU, usually a digital signal processor, memory, I/O interfaces, and a great deal of software and stored data.
This block not only interprets the incoming sensor data and controls the rest of the instrument but also combines sensor input with sampled or algorithmically generated audio waveforms and shapes these waveforms to produce a digital audio output stream. It is here that the craftsmanship happens.
The digital audio signal may be sent to external devices via an interface such as USB or passed on to digital-to-analog conversion and then to an audio amplifier.
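The sensor-to-audio chain described above can be sketched in a few lines. The code below is a minimal illustration, not any vendor's implementation: it assumes a hypothetical key sensor that reports a normalized strike velocity, and stands in for a real amplitude envelope with a simple exponential decay.

```python
import math

SAMPLE_RATE = 48_000  # Hz, a common digital-audio rate

def render_note(freq_hz, velocity, duration_s, decay_s=0.5):
    """Sketch of the digital subsystem's job: turn a sensor event
    (key pitch plus strike velocity) into a digital audio stream.

    `velocity` is a normalized 0..1 reading from the key sensor;
    the exponential decay is a stand-in for a real envelope shaper.
    """
    n_samples = int(SAMPLE_RATE * duration_s)
    out = []
    for n in range(n_samples):
        t = n / SAMPLE_RATE
        envelope = velocity * math.exp(-t / decay_s)
        out.append(envelope * math.sin(2 * math.pi * freq_hz * t))
    return out

# A 440 Hz note struck at 80% of full velocity, one second long.
samples = render_note(freq_hz=440.0, velocity=0.8, duration_s=1.0)
```

A real instrument would replace the sine oscillator with sampled or algorithmically generated waveforms and run this loop on the DSP in real time, but the data flow — sensor event in, shaped sample stream out — is the same.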
A range of solutions
This description fits various instrument types, levels of sophistication and performance, and price points (Figure 2). In principle, the only differences between digital instruments are the input devices and the software. But the reality is more complex than that.
Figure 2 The above chart highlights three keyboard market segments. Source: Faraday Technology Corp.
Both engineering expertise and musical knowledge are used in the design decisions that produce different types of instruments. What kinds of sensors, and where? What type of analog-to-digital converter (ADC) should be employed, how many channels should be used, and what is the sample rate? What will be the tasks for the CPU and DSP, and consequently, how powerful must each be?
What are the necessary resolution, sample rate, noise level, and distortion of the digital-to-analog converter (DAC)? These choices, along with the software design and the vendor’s extensive library of sound samples, will set the instrument’s personality, whether a child’s toy or a concert paragon.
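Converter resolution translates directly into achievable dynamic range. A standard rule of thumb relates the theoretical signal-to-noise ratio of an ideal N-bit converter, driven by a full-scale sine wave, to its bit depth:

```python
def ideal_quantization_snr_db(bits):
    """Theoretical SNR of an ideal N-bit converter for a full-scale
    sine input: SNR = 6.02 * N + 1.76 dB (quantization noise only;
    real converters fall somewhat short of this)."""
    return 6.02 * bits + 1.76

for bits in (12, 16, 24):
    print(f"{bits}-bit converter: ~{ideal_quantization_snr_db(bits):.1f} dB SNR")
# 12-bit: ~74.0 dB, 16-bit: ~98.1 dB, 24-bit: ~146.2 dB
```

This is one reason a toy keyboard can live with a modest converter while a performance instrument needs 24-bit paths: the headroom between a pianissimo note and the noise floor is set by these few dB-per-bit.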
Design implementation
The obvious way to implement the electronic portion of these musical instruments is with a discrete data converter, microcontroller, DSP, and memory chips. This approach allows for a fast time to market and will enable designers to select just the right chip for the intended performance level. It also allows the design team to focus most of their effort on the software from which the instrument’s character will emerge.
However, at least three issues exist with using discrete, off-the-shelf ICs for anything less than a premium professional instrument. First, if the organization intends to market a range of instruments at different price points, the discrete approach will lead to a proliferation of bills of materials (BOMs) and board designs, complicating supply-chain management. Worse, it will require several software versions, each of which must be maintained and kept coherent.
Second, using discrete chips will make it difficult to protect proprietary software IP from theft. All the pins on the chips are exposed to probing, allowing competitors to watch the operation of the digital electronics and even use diagnostic tools to examine memory and code. Further, the choice of ICs in the design will be visible, if not on the package lids, then on inspection of the dies inside.
Third, sophisticated designs may rely on proprietary hardware—especially in the data converters and the DSP core—to achieve the price/performance point intended at the high end of the product family. Duplicating these special hardware features in off-the-shelf chips may not be possible without carrying out massive overdesign.
Taking the ASIC path
These considerations have led some musical instrument design teams to employ a mixed-signal ASIC (Figure 3). An audio ASIC answers each of the three problems of a discrete design while serving as the foundation of digital electronic instrument designs.
Figure 3 The musical instruments ASIC is segmented into 186 MHz (left) and 192 MHz (right) domains. Source: Faraday Technology Corp.
First, the unit cost of an ASIC for these applications will be low enough that the same chip can be used across a broad product line, often without changing the board design. That cost may be lower than the total cost of discrete chips, especially once inventory, assembly, and test costs are included. A modular approach to software design and test design can allow one version of the software and one test bench to serve all the products in the family. This hugely simplifies debug and life-cycle management.
Second, the ASIC’s data paths and circuits are inside the die, safe from all but the most determined examination. The exception would be external code and audio sample memory. However, these can be encrypted, with the ASIC providing hardware-based encryption and decryption, so the software and data crown jewels are never exposed to the outside world in unencrypted form.
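The encrypt-at-rest scheme described above can be illustrated with a toy symmetric cipher. This sketch is purely conceptual: the keystream built from SHA-256 in counter mode is a stand-in, not a secure cipher, and a real ASIC would use a dedicated hardware AES engine with the key held on-die.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Toy keystream: SHA-256 of key || counter, illustrative only.
    A production ASIC would use a hardware AES block instead."""
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Encrypt or decrypt (XOR with the keystream is symmetric)."""
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

key = b"on-die secret key"           # never leaves the ASIC
firmware = b"proprietary DSP code and wavetable samples"
stored = xor_cipher(key, firmware)   # what external flash actually holds
recovered = xor_cipher(key, stored)  # decrypted inside the die at runtime
```

The point the article makes holds regardless of cipher choice: the plaintext code and sample data exist only inside the die, while the external memory ever holds only ciphertext.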
Third, if the developers have proprietary circuit designs for audio signal paths, a unique DSP architecture, or even a favorite CPU core, these can be implemented in the ASIC without concern for whether equivalents will remain available off the shelf for the entire life of the product family.
However, there is an obvious objection to choosing an ASIC: musical instrument designers rarely have complete in-house ASIC design teams, and they are unlikely to want to assemble one for a single project. Nor do they have the network of relationships with silicon IP providers, chip foundries, and outsourced assembly and test houses that turns a chip design into a reliable stream of finished chips. This is where a flexible, full-range ASIC partner comes in.
An ASIC case study
To show the importance of a partner, let’s look at a representative, composite example of an ASIC engagement. Faraday began discussions with a globally known musical instrument manufacturer. In addition to documenting the desired gross architecture, performance, and features, the initial conversation covered many of the points we have just discussed.
This organization was quite sophisticated in digital audio design, with its own DSP algorithms, logic designs for some critical digital functions, and precise specifications for mixed-signal functions. On the other hand, Faraday drew upon its internal IP libraries and extensive network of third-party IP vendors to gather the non-proprietary blocks, including an Arm CPU subsystem, memory, and communications interfaces.
Next, Faraday determined that the design could meet the music company’s demanding digital-to-analog converter requirements with available IP, eliminating the need for an external DAC. Further, Faraday worked with the instrument designers to produce a netlist for a DSP core optimized for the music company’s algorithms.
Faraday then took the chip design through the customary ASIC design flow of IP integration, functional verification, synthesis, and mixed-signal integration. At that point, it stepped in to complete the back-end design, conferring with the instrument design team when necessary, and taped out to the foundry the two partners had jointly selected.
About 18 months after the initial engagement, the musical instrument company received working silicon from the assembly and test vendor Faraday had arranged. A flexible engagement such as this can make an ASIC design the realistic best choice for a musical instrument or another such electronic product.
Kevin Kai-Wen Liu is a project manager at Faraday Technology Corp.’s headquarters in Hsinchu, Taiwan.
The post Electronic musical instruments design: what’s inside counts appeared first on EDN.