AI at the Silicon Level

The idea of baking security into an application isn’t new in the software world, nor are security features in semiconductor technologies, such as memory. But the value of data, particularly in artificial-intelligence (AI) workloads, means hardware-enabled security is getting more attention.

Many networking and memory technologies have built-in security features — the “S” in SD card stands for secure, and SSDs have long had the ability to encrypt data. The key challenges for enabling hardware-level security features, however, are educating users on how to implement them and ensuring that security doesn’t hinder performance of the device and the overall system.

Although hardware-enabled security has been around for a while, securing AI workloads is a relatively new concept, said Carl Shaw, safety and security architect at Codasip, a company that focuses on processor design automation and RISC-V processor IP.

Hardware-enabled security technologies for AI

Securing AI can be broken down into two stages: security when training the network, which is more of an IT security issue, and security at the inference stage, which occurs when executing the network, according to Shaw. “This is the part that would most likely happen on a Codasip processor performing edge AI that we need to consider.”

Because AI algorithms are software, Codasip’s secure boot and CPU-level security mechanisms protect any software that runs on the device from tampering and IP theft, Shaw said. “Our fault protection technology will stop any corruption of the AI network while it is executing.”
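
The general mechanism behind a secure boot check can be sketched in a few lines: the boot stage holds a trusted public key, hashes and verifies the vendor's signature over the next-stage image, and refuses to run anything that fails the check. The sketch below uses Python's cryptography package with Ed25519 purely as an illustration; it is not Codasip's implementation, and the image and key names are placeholders.

```python
# Minimal sketch of a secure-boot check: refuse to execute an image whose
# signature does not verify against a trusted public key. Illustrative only;
# not Codasip's implementation.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# At manufacturing time the vendor signs the firmware/AI application image.
vendor_key = Ed25519PrivateKey.generate()
firmware_image = b"application binary including the AI inference code"
signature = vendor_key.sign(firmware_image)

# The boot ROM ships only the public key and verifies before jumping to the image.
trusted_public_key = vendor_key.public_key()

def secure_boot(image: bytes, sig: bytes) -> bool:
    try:
        trusted_public_key.verify(sig, image)   # raises if the image was altered
        return True                             # safe to execute
    except InvalidSignature:
        return False                            # halt: image has been tampered with

assert secure_boot(firmware_image, signature)
assert not secure_boot(firmware_image + b"patched", signature)
```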

Encryption, integrity protection and authentication IP will provide protection for the output from the AI model so it can be safely stored and forwarded, he added. “This is particularly important where privacy is required.”
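
Protecting model output for storage and forwarding typically combines confidentiality with integrity and authentication, which is what an authenticated-encryption (AEAD) cipher provides in one operation. The sketch below uses AES-GCM from Python's cryptography package as a generic stand-in; the key handling and field names are assumptions, not the vendor's IP.

```python
# Generic sketch: protect an inference result with authenticated encryption
# (AES-GCM) so it can be stored or forwarded privately and tamper-evidently.
# Illustrative only; not the vendor's actual IP.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # in practice: a device-protected key
aead = AESGCM(key)

inference_result = b'{"label": "pedestrian", "confidence": 0.97}'
metadata = b"device-id=edge-node-42"        # authenticated but not encrypted

nonce = os.urandom(12)                      # must be unique per message
ciphertext = aead.encrypt(nonce, inference_result, metadata)

# Any modification of nonce, ciphertext, or metadata makes decryption fail.
plaintext = aead.decrypt(nonce, ciphertext, metadata)
assert plaintext == inference_result
```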

But with CPUs already being taxed by the ever-increasing amount of microservices supporting containerized and virtualized apps spread across data centers, handling security just adds more pressure. Nvidia’s latest data processing unit (DPU), the BlueField-2, is used to offload, isolate, accelerate and secure data center infrastructure services so that CPUs and GPUs are free to focus on running and processing large volumes of workloads, including AI. The new DPU also reflects a trend toward hardware playing a key role in implementing zero-trust security in the data center and at the edge.

Nvidia’s new BlueField-2 DPU reflects a trend toward hardware playing a key role in implementing zero-trust security in the data center and at the edge. (Source: Nvidia)

Other hardware-enabled security technologies getting more attention lately are physical unclonable function (PUF) keys, due in part to the increased need for encryption and digital signatures. A PUF is a physical object that, for a given input and conditions (otherwise known as a “challenge”), produces a physically determined “digital fingerprint” output that acts as a unique identifier, most often for a semiconductor device such as a microprocessor. ReRAM manufacturer Crossbar recently announced it would apply its technology to hardware security applications.
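
The challenge-response behavior of a PUF can be modeled in a few lines: each device has an unclonable “fingerprint,” and the same challenge applied to two different devices yields different responses. The toy model below fakes the fingerprint with random bytes and an HMAC; a real PUF derives it from silicon manufacturing variation, and all names here are illustrative.

```python
# Toy model of a PUF's challenge-response behavior. A real PUF derives its
# "fingerprint" from manufacturing variation in the silicon; here it is faked
# with random bytes so the enrollment/verification flow can be shown.
import hashlib
import hmac
import os

class SimulatedPUF:
    def __init__(self):
        self._fingerprint = os.urandom(32)   # stands in for physical variation

    def response(self, challenge: bytes) -> bytes:
        # Deterministic per device: the same challenge gives the same response
        # on this chip, but a different response on any other chip.
        return hmac.new(self._fingerprint, challenge, hashlib.sha256).digest()

device_a, device_b = SimulatedPUF(), SimulatedPUF()
challenge = os.urandom(16)

# Enrollment: the manufacturer records challenge/response pairs for device A.
enrolled = device_a.response(challenge)

# Later authentication: only the genuine device reproduces the response.
assert device_a.response(challenge) == enrolled
assert device_b.response(challenge) != enrolled
```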

PUF keys aren’t new, either, but the growth of the internet-of-things market has created many opportunities, and an increasing number of ASICs, microcontrollers (MCUs) and systems-on-chip (SoCs) are embedding hardware cryptographic accelerators or software cryptographic libraries. IoT devices at the edge are increasingly doing more AI workloads, too, which means IoT device makers need a way to protect keys and other secret material by encrypting or wrapping them with other keys — the root key must be kept secret and secret keys must be protected.

Intrinsic ID’s solution, for example, is an SRAM PUF-based key vault, which ensures that no unencrypted secrets are stored on-chip, while a root key is generated from an SRAM PUF when needed.

Among Intrinsic ID’s security offerings is an SRAM PUF-based key vault, which ensures no unencrypted secrets are stored on-chip; a root key is generated from an SRAM PUF when needed. (Source: Intrinsic ID)
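
The key-vault idea can be sketched as follows: the root key is re-derived from the SRAM PUF response each time it is needed, used to wrap (encrypt) application keys, and only the wrapped blobs are stored, so no plaintext key ever sits in non-volatile memory. This is a simplified illustration assuming an already error-corrected PUF response; it is not Intrinsic ID's product code.

```python
# Simplified sketch of an SRAM-PUF key vault: the root key is re-derived from
# the PUF on demand and used to wrap application keys, so only encrypted key
# blobs are ever stored. Assumes an error-corrected PUF response; not
# Intrinsic ID's actual implementation.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def derive_root_key(puf_response: bytes) -> bytes:
    # Re-derived at power-up from the SRAM start-up pattern; never stored.
    return HKDF(algorithm=hashes.SHA256(), length=32,
                salt=None, info=b"root-key").derive(puf_response)

def wrap_key(root_key: bytes, app_key: bytes) -> bytes:
    nonce = os.urandom(12)
    return nonce + AESGCM(root_key).encrypt(nonce, app_key, b"key-vault")

def unwrap_key(root_key: bytes, blob: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(root_key).decrypt(nonce, ciphertext, b"key-vault")

puf_response = os.urandom(32)            # stands in for the SRAM start-up pattern
root = derive_root_key(puf_response)

app_key = AESGCM.generate_key(bit_length=128)
stored_blob = wrap_key(root, app_key)    # only this encrypted blob goes to flash

assert unwrap_key(derive_root_key(puf_response), stored_blob) == app_key
```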

Device vendors need security support

Security at the hardware level saw a tipping point around 2015, when IoT really started to take off, according to Intrinsic ID CEO Pim Tuyls. This is leading companies to be more proactive about security. “Once you start depending on all those devices around you and they get hacked, then it’s a big disaster.”

Even small nodes need security, but you can’t simply copy things that have been done for PCs, Tuyls added. “That’s where we see the growth.”

Tuyls said Intrinsic ID is working with many device vendors that want to focus on their core expertise and look to the company to help them build in security functionality. For example, it provides solutions for government, military and aerospace customers that place security at the top of their business needs, he noted.

The root of trust in Gowin Semiconductor’s SecureFPGA, for example, delivers an anchor of trust for security use cases using Intrinsic ID’s BK, a secure root-key generation and management software solution for IoT security. It allows device manufacturers to secure their products with an internally generated, unique identity without adding costly, security-dedicated silicon.

Similarly, Intrinsic ID’s QuiddiKey hardware IP solution enables device manufacturers and designers to secure their products with internally generated, device-unique cryptographic keys without additional silicon. Tuyls said the digital IP can be applied to almost any chip, ranging from tiny MCUs to high-performance SoCs.
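
One common pattern behind “device-unique keys without extra silicon” is key diversification: a single internally generated root key is expanded into separate purpose-specific keys, so compromising one does not expose the others. The sketch below shows that pattern with HKDF; the labels are made up, and this is not the QuiddiKey or BK API.

```python
# Key diversification: expand one device-unique root key into separate keys for
# different purposes. Labels are illustrative; not the actual QuiddiKey/BK API.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

device_root_key = os.urandom(32)   # in practice: derived on-chip, e.g. from a PUF

def purpose_key(root: bytes, label: bytes) -> bytes:
    return HKDF(algorithm=hashes.SHA256(), length=32,
                salt=None, info=label).derive(root)

tls_key     = purpose_key(device_root_key, b"tls-identity")
storage_key = purpose_key(device_root_key, b"data-at-rest")
ota_key     = purpose_key(device_root_key, b"firmware-update")

# Same root, different labels: independent keys per use case.
assert len({tls_key, storage_key, ota_key}) == 3
```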

Security must be at the hardware level because so many devices rely on chips to function — even if the chip doesn’t execute security-related tasks, it must be protected because it’s connected to something that is, he said. “It opens another door that should not have been opened.”

Compounding the problem is that it’s not always easy to see the connections and predict how things can go wrong if someone gains access to a chip. “It does not make sense to have a very strong lock on the front door and keep the back door of your house open,” Tuyls said. To extend the analogy, you don’t want someone to break in through the back door and easily find valuables lying around without an alarm going off.

Smart-home devices with basic AI are one segment that needs protection, as well as automotive, given the many chips and MCUs that are getting smaller every day. “You don’t want the software to be changed — strange things can happen in the car while you’re driving,” Tuyls said. This includes an attacker that could hack into a vehicle, in which case hardware-enabled data protection is crucial. “It needs to be strong enough; otherwise, we all end up in big trouble.”

Tuyls said there’s a bigger need for security at the hardware level because the impact is exponential if there’s a problem. Hardware-level security is also easier to add, although it’s the integration with software that makes for a strong, secure solution.

DRAM is the next security frontier

While corrupted AI can create havoc, the need to protect it is also driven by the fact that AI workloads have become valuable intellectual property for many companies. This is why Rambus has placed its focus on where AI data is crunched — DRAM. The company is an Intrinsic partner and has a wide range of root-of-trust solutions that offer anti-tamper and security techniques for IoT devices and data center applications.

Scott Best, director of anti-counterfeiting products and technologies in the security business unit at Rambus, told EE Times that securing memory for any workload, AI included, is an expansive topic because memory means many things, from low-density non-volatile types for scratch-pad applications to flash-based SSDs in a server. “AI and machine learning [ML] is really going to be impacting DRAM security in a pretty transformational way.”

He said DRAM security is a really difficult problem to solve, which is why DRAM is one of the last types of memory that people have been willing to take on and protect.

Security is critical, as DRAM is taking on some of the biggest and most intensive AI workloads, which involve large datasets that require a system’s off-chip memory. “High-performance memory systems are not only supporting very large datasets, but the value of those datasets is also orders of magnitude more valuable than they used to be in previous generations,” Best said. As an IP company, Rambus delivers circuits, hardware and software that customers are going to compile into an SoC or FPGA to execute an algorithm. “But now the data structure itself is the product,” he added.

Autonomous vehicles that drive around neighborhoods with cameras gathering training data are an excellent example, he said. The compute and physical work needed to gather training sets and distill them into a workload that an inference engine can execute at the edge is valued at tens of millions of dollars. This change in what’s considered valuable is also challenging people’s notions of security perimeters. “It now includes data that’s inside of off-chip DRAM memory.”

Best said this idea isn’t new for aerospace and defense industries, where there are some proprietary, highly classified algorithms in extremely secure systems. “U.S. defense has had a long-storied interest in DRAM security, but it’s now transitioning over to much larger consumer-scale markets because of this AI/ML transformation that we’re seeing now.”

He also said the risk to organizations is even higher given the value of this data. “It’s not just that adversaries are trying to penetrate a system, denial of service or learn the secrets they actually have — there’s actually strong value associated with that data.” If it can be stolen and used in a competitive system, there’s revenue at stake in the commercial world.

DRAM security is in its first generation of solutions, according to Best, while securing non-volatile memory is generally easier because the latency penalties aren’t there. SSDs have space in reserve, but DRAM systems don’t have any extra memory. A 128-GB module, for example, is exactly 128 GB.

“There’s no extra memory swimming around on that module that is easy to use,” Best said. The latency penalties are what make DRAM the most difficult type of memory to protect in a system, he added.
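
Best's point about capacity can be made concrete with back-of-the-envelope arithmetic: typical memory-encryption schemes keep a counter and an integrity tag per cache line, and on a fixed-capacity DIMM that metadata has to come out of memory the application thinks it owns. The sizes below are generic textbook-style assumptions (64-byte lines, 8-byte counters and MACs), not figures from Rambus.

```python
# Back-of-the-envelope overhead for counter-mode DRAM encryption with per-line
# integrity tags. Assumed sizes are generic, not Rambus figures.
MODULE_BYTES = 128 * 2**30     # a "128-GB" DIMM holds exactly 128 GiB, no spare
CACHE_LINE   = 64              # bytes protected per encryption unit
COUNTER      = 8               # per-line counter for counter-mode encryption
MAC          = 8               # per-line integrity tag

lines            = MODULE_BYTES // CACHE_LINE
metadata_bytes   = lines * (COUNTER + MAC)
overhead_percent = 100 * metadata_bytes / MODULE_BYTES

print(f"{lines:,} cache lines")                          # 2,147,483,648 lines
print(f"{metadata_bytes / 2**30:.0f} GiB of metadata")   # ~32 GiB
print(f"{overhead_percent:.0f}% of the module")          # ~25% capacity overhead
```

In an SSD, overhead of that scale can hide in over-provisioned flash; in DRAM it has to displace usable memory or live off-module, and every access to it adds latency.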

Memory provides a foundation of trust for the intelligent edge

Micron Technology clearly has skin in the game when it comes to making DRAM secure, but with the recent introduction of its silicon root-of-trust solution, Authenta, it’s also looking at the larger picture of securing AI/ML data. The cloud-based security offering is aimed at securing the IoT and intelligent edge, and the company has enabled Authenta in a portfolio of industrial-grade Serial Peripheral Interface NOR (SPI-NOR) devices with increased density and packaging options. Collaborators on the Authenta Cloud Platform include Swissbit AG and SanCloud.

Luis Ancajas, director of IoT security solutions at Micron, who runs the Authenta business unit, said the expansion of the Authenta portfolio extends hardware security and services to a broader range of devices to help bolster secure cloud services at the edge, with strong hardware security as a foundation.

Authenta uses strong cryptographic identity and secure element features directly embedded in the flash memory. “Trust is not given; it’s built,” Ancajas said. “What we’re doing is we’re building a very foundational piece rooted to silicon because silicon is the common denominator.”

Micron’s Authenta Cloud Platform integrates silicon security, platform integrity and purpose-built trust models, enabled by a flexible root of trust in flash, to streamline security for the cloud and intelligent edge. (Source: Micron Technology)

Flash is just as common, Ancajas added, as illustrated by how SanCloud devices can use Authenta to offer customers a wide range of security features, including secure boot, golden image locking, memory block allocation, secure over-the-air updates and device integrity monitoring. Meanwhile, Swissbit is beginning to integrate Authenta into a microSD card that is ideal for retrofitting IoT systems.
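
Two of the features listed above, golden image locking and device integrity monitoring, boil down to comparing what is currently in flash against a known-good measurement. The sketch below shows that comparison with a plain SHA-256 digest; it is a generic illustration, not Authenta’s or SanCloud’s implementation.

```python
# Generic sketch of device integrity monitoring: measure the current flash
# contents and compare against a locked "golden" measurement. Illustrative
# only; not Authenta's implementation.
import hashlib

def measure(image: bytes) -> bytes:
    return hashlib.sha256(image).digest()

golden_image = b"known-good firmware contents"
golden_measurement = measure(golden_image)   # locked at provisioning time

def integrity_check(current_flash: bytes) -> bool:
    # Run at boot or periodically; a mismatch means the image was altered.
    return measure(current_flash) == golden_measurement

assert integrity_check(golden_image)
assert not integrity_check(golden_image + b"\x00 malicious patch")
```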

For Authenta, the intelligent edge includes not only IoT but also data centers, smart homes and automotive, the last of which is becoming a platform for personalization for different end customers, whether it’s an individual car owner or a fleet owner. This is driving security requirements that must be addressed at the hardware level and scale across a broad ecosystem, including the cloud, Ancajas said. “This is not an easy endeavor for many. It crosses the chasm of vendors.”

Enabling security spans system design, chip design, contract manufacturing and beyond, he said, which is why Micron built the Authenta platform — to streamline the process by playing the role of a security architect that glues programming and security functions together, including the necessary cloud APIs.

Ancajas said it was important to make sure the architecture was simple and flexible while aligning well with business models. Security is ultimately an enabler in revenue generation, and tying it to memory is key to flexibility.

He also distinguishes between protecting data and protecting the cloud. Authenta isn’t just about encrypting the data of an IoT device — it’s about authenticating it and being able to trust that there hasn’t been any compromise to the system. “Security is the underlying foundation. Trust is the actual element that it translates to allow you to deliver service.”


