Steve Douglass, Vice President of R&D at Lattice Semiconductor, talks about how to manage the fast pace of technological change. This will also be his topic as a keynote speaker at the embedded world Conference in June.
What are the major trends in embedded technology that affect the system design flow?
There has never been a more exciting time to be in tech, and embedded technology is becoming deeply ingrained in virtually all aspects of our world today. There are many trends driving this accelerated adoption of embedded technology, including the rapid expansion of IoT, the explosive growth of AI in all application domains, the rise of factory automation, and the shift toward autonomous driving. The increasing interconnectedness of our devices, paired with the integration of more intelligence and an insatiable hunger for more power-efficient processing, requires us to reassess our system design flow, as well as our overall design mindset.
These applications require a level of flexibility in the system design itself that will challenge engineers to think differently. They need connectivity to both local networks and the cloud, and robust system security. They must have trained neural networks for intelligent, real-time decision making, in addition to power-efficient computing that can keep pace with ever-changing algorithms at the hardware, software, and application layers. It is imperative that these system capabilities be understood and considered when creating new designs, and that the systems are able to adapt over time as requirements continue to evolve.
What impact will AI and adaptive systems have on the market?
AI is and will continue to be one of the most disruptive technology applications ever. We have only just begun to scratch the surface of how to integrate intelligence into our devices and lives.
Today, this is happening mainly with technologies we are familiar with – our cars, our homes, PCs, phones, and workplace technology in office buildings, factories, and more. In the future, we can expect new use cases that will integrate intelligent devices into our lives in ways we have yet to imagine.
And, while all of these technologies will serve different purposes, they will all share a common need at their core: adaptability. Just as the use cases and devices themselves will continue to evolve, so must their underlying technology. As engineers, we will need to consider this from the design phase onward, so that we build systems that can be easily updated post-deployment and ensure designs remain relevant as the rapid pace of innovation continues.
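As a minimal sketch of this design-for-updatability mindset, a field-deployed device might validate a new configuration image before applying it. This is a hypothetical illustration only, not Lattice's actual update mechanism; the manifest format, field names, and versioning policy are all assumptions:

```python
import hashlib

# Hypothetical sketch: validate a post-deployment update image before
# applying it. The manifest layout and its field names ("sha256",
# "version") are invented for illustration, not a vendor protocol.
def validate_update(image: bytes, manifest: dict, current_version: tuple) -> bool:
    # 1. Integrity: the image hash must match the manifest.
    if hashlib.sha256(image).hexdigest() != manifest["sha256"]:
        return False
    # 2. Monotonic versioning: never roll back to an older release.
    new_version = tuple(int(p) for p in manifest["version"].split("."))
    return new_version > current_version

image = b"\x00" * 64  # stand-in for a real configuration bitstream
manifest = {"sha256": hashlib.sha256(image).hexdigest(), "version": "2.1.0"}
print(validate_update(image, manifest, current_version=(2, 0, 3)))  # True
```

A real system would add cryptographic signing and a fallback image, but the core idea holds: the ability to verify and accept new hardware configurations is designed in from day one.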
Is reconfigurable hardware a tool for emerging technologies that will transition to hard-wired platforms as their algorithms mature, or do you see domains that will require reconfigurability forever?
Reconfigurable hardware is extremely beneficial for emerging technologies when architectures and algorithms are evolving rapidly. It provides flexibility for updating system hardware that is already in the field so that it can adapt to changes in situ. Many applications benefit from programmable hardware and leveraging that flexibility in the architecture of the system (e.g., Software Defined Radios, subscription-based services that change with time, etc.).
Reconfigurable hardware is also very useful for efficiently building product families with varying feature sets. For example, a single hardware design can be configured into multiple product variants, which helps minimize inventory levels and optimize manufacturing flows.
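The product-family idea can be sketched in a few lines. In this hypothetical example (the variant names, feature sets, and bitstream files are invented, not actual Lattice products), one stocked hardware SKU becomes three shippable products simply by loading a different configuration at manufacturing time:

```python
# Hypothetical sketch: one reconfigurable SKU serving a product family.
# The same board is programmed with a different configuration bitstream
# per variant, so only one hardware item needs to be stocked.
VARIANTS = {
    "basic": {"features": {"uart"},                        "bitstream": "basic.bit"},
    "plus":  {"features": {"uart", "ethernet"},            "bitstream": "plus.bit"},
    "pro":   {"features": {"uart", "ethernet", "ai_accel"},"bitstream": "pro.bit"},
}

def configure(order: str) -> str:
    """Return the bitstream to program for the ordered product variant."""
    return VARIANTS[order]["bitstream"]

# One inventory item, three shippable products:
print([configure(v) for v in ("basic", "plus", "pro")])
```

The same lookup could run in the field, which is what enables subscription-style feature upgrades on hardware that has already shipped.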
And, in general, reconfigurability and programmability are good for “future proofing” systems by providing a path for hardware upgrades throughout the life of the product. Fixed hardware, in comparison, would need to be physically replaced each time an update is required, which is time consuming, costly, and creates more waste as the replaced technology becomes e-waste.
Adaptability and flexibility are great, but many industries and applications need stability or certification. What is your view on this dilemma?
Adaptability and flexibility are not necessarily at odds with stability and certification. In fact, it can be quite the opposite. Application and market requirements change over time and, traditionally, this drives product evolution and upgrades. If flexibility and adaptability can be built into products such that they can address these changing requirements without replacing the entire product or system, it can actually be more stable as it extends the longevity of the product.
That said, flexibility and adaptability can add complexity to the product, which can add to the validation, productization, and certification time of the product when it is production released. However, the extra time and effort spent during productization can be more than offset by the benefits of extending the lifetime of the system through post-deployment updates.
FPGAs require special development tools and languages. How do you simplify the life of developers?
Software is a key pillar of our overall portfolio, and we offer an array of tools to make it easier for our customers to adopt and integrate our technology – ranging from development tools to application-targeted solution stacks, pre-engineered solutions our customers can use to get to market quickly.
Our primary goal with our Radiant and Diamond FPGA development tools is to simplify the design process and maximize the productivity of the system developer. We do this by supporting industry-standard HDL languages, timing constraints, and simulators, and by providing sample code, reference designs, and evaluation boards and kits. Our Propel Embedded Design Environment also includes a Software Development Kit and a graphics-based System Builder for easy construction of embedded hardware designs.
We want our customers to know that when they choose Lattice, they have all the tools and support they need to bring their design to life quickly and easily.
What is your expectation for virtual and for physical events in the future?
Personally, I have a strong preference for physical events, as there is no substitute for meeting face-to-face to connect with people. That said, the pandemic has taught us that we can still connect and communicate successfully through virtual events, and they offer the opportunity to participate in events you wouldn't have been able to attend if in-person were the only option.
As with engineering, I expect events will adapt. Going forward, I think it is safe to assume that events will become a hybrid of both mediums. The in-person element will be similar in many ways to what we experienced before the pandemic, supplemented by a virtual presence that draws on the best practices acquired over the last couple of years for those unable to attend in person.