EDGE COMPUTING & AI
ABS enclosure, is one such module that measures 70x122x30mm and has a power consumption of 1.2W at 24V DC.
Because Industry 4.0 requires all devices to be connected
to the same network, being able to power them all from a
single source is vital. Power over Ethernet technology makes
this possible, using the same network cable that transfers data
across devices in the network. This set-up can help to provide
substantial reductions in the cost of electrical installations,
as shown with the industrial gigabit ANTAIRA LNP-0500G-24
PoE switch (Figure 2) which has a metal housing, five access
ports and a built-in voltage booster. Manufactured by ANTAIRA
Technologies, a USA-based maker of industrial communication
solutions, the product is ideal for applications that demand a high-power PoE source in harsh environments. These
include security surveillance, traffic monitoring systems, oil/gas
and mining applications, facilities management for power/utility,
water/wastewater treatment plants and automated production
lines in smart factories.
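As a rough illustration of the kind of power budgeting PoE enables, the sketch below checks whether a switch's total PoE budget can feed a given set of powered devices within the IEEE 802.3af/at per-port limits. The port count, total budget and per-device loads are assumed figures for illustration only, not specifications of the LNP-0500G-24.

```python
# Illustrative PoE power-budget check (assumed figures, not ANTAIRA specs).
# IEEE 802.3af ("Type 1") delivers up to 15.4 W per port at the switch (PSE);
# IEEE 802.3at ("PoE+", "Type 2") delivers up to 30 W per port.

AF_PORT_LIMIT_W = 15.4   # 802.3af per-port limit at the PSE
AT_PORT_LIMIT_W = 30.0   # 802.3at per-port limit at the PSE

def fits_budget(device_loads_w, ports=5, total_budget_w=120.0,
                port_limit_w=AT_PORT_LIMIT_W):
    """Return True if every device fits a port limit and the sum fits the budget."""
    if len(device_loads_w) > ports:
        return False
    if any(load > port_limit_w for load in device_loads_w):
        return False
    return sum(device_loads_w) <= total_budget_w

# Example: four cameras and one access point (loads in watts, assumed values)
loads = [12.0, 12.0, 12.0, 12.0, 20.0]
print(fits_budget(loads))  # True under the assumed 120 W budget
```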
Conclusion
It’s clear that relying solely on cloud computing when bringing IIoT into an automation environment where data is shared locally is no longer sufficient if costs are to be kept to a minimum and companies are to become as productive and efficient as possible. A combination of cloud computing
and edge computing is now the recognised optimum solution
and there are many robust and easy-to-use devices and products
on the market to turn this into reality. Whatever the industry
sector, and no matter how demanding the environment, edge
computing is now an essential factor when making the move
towards Industry 4.0.
The challenge at the edge of memory
By Jeff Lewis
The recent drive to bring intelligence to the edge, rather
than the cloud, has created a conundrum - hardware
constraints are beginning to cripple innovation. But new
memories are being developed to replace SRAM and DRAM to
solve this dilemma.
If data is the new oil, then artificial intelligence (AI) is what will
process data into a truly invaluable asset. It’s this belief that’s
causing the demand for AI applications to explode right now.
According to PwC and MMC Ventures, funding for AI startups is rapidly increasing, reaching over $9 billion last year, with tech startups that have some type of AI component receiving up to 50% more funding than others. This intense investment
has led to rapid innovation and advances for AI technology. But
the traditional AI use model of “sweep it up and send it to the
cloud” is breaking down as latency or energy consumption can
make transmission impractical. Another major challenge is that
consumers are increasingly uncomfortable having their private
data in the cloud potentially exposed to the world.
For those reasons, AI applications are being pushed out of
their normal data-center environments, allowing their intelligence
to reside at the edge of the network. As a result, mobile
and IoT devices are becoming “smarter,” and a whole variety of
sensors—especially security cameras—are taking up residence
at the edge. However, this is where hardware constraints are
beginning to cripple innovation.
Increasing the amount of intelligence living at the edge requires much more computational power and memory than traditional processors provide. Studies have repeatedly shown that AI model inference accuracy depends heavily on the amount of hardware resources available. Since customers require ever-higher accuracy (voice detection, for example, has evolved into multifaceted speech and vocal pattern recognition), this problem only intensifies as AI models grow more complex.
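To make the link between model complexity and hardware resources concrete, the sketch below estimates the weight-memory footprint of a small neural network; this is the quantity that must fit in on-chip SRAM or external DRAM on an edge device. The layer sizes and data types are hypothetical, chosen only to contrast a simple keyword-spotting-style model with a larger speech-recognition-style one.

```python
# Illustrative estimate of inference memory footprint (assumed layer sizes).
# A bigger / more accurate model means more parameters, and every parameter
# must live in SRAM or DRAM on the edge device.

def weight_footprint_bytes(layer_shapes, bytes_per_param=1):
    """Sum parameter counts over dense layers and convert to bytes.

    layer_shapes: list of (inputs, outputs) pairs for fully connected layers.
    bytes_per_param: 1 for int8-quantised weights, 4 for float32.
    """
    params = sum(i * o + o for i, o in layer_shapes)  # weights + biases
    return params * bytes_per_param

# A small keyword-spotting-style network (hypothetical sizes)
small = [(490, 256), (256, 256), (256, 12)]
# A larger speech-recognition-style network (hypothetical sizes)
large = [(490, 1024), (1024, 1024), (1024, 1024), (1024, 512), (512, 29)]

print(weight_footprint_bytes(small) / 1e3, "kB (int8)")
print(weight_footprint_bytes(large, bytes_per_param=4) / 1e6, "MB (float32)")
```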
One significant concern is simply the need for electrical
power. Arm has predicted that there will be 1 trillion connected
devices by the 2030s. If each smart device consumes 1W (security
cameras consume more), then all of these devices combined
will consume 1 terawatt (TW) of power. This isn’t simply
an “add a bigger battery” problem, either. For context, the total
generating capacity of the U.S. in 2018 was only slightly higher
at 1.2 TW. These ubiquitous devices, individually insignificant,
will create an aggregate power catastrophe.
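The arithmetic behind that comparison is straightforward; the short sketch below simply restates it, using the 1 W per device and 1.2 TW figures quoted above.

```python
# Back-of-the-envelope aggregate power for 1 trillion edge devices.
devices = 1e12          # Arm's projection of connected devices by the 2030s
watts_per_device = 1.0  # assumed average draw per device (cameras draw more)

total_w = devices * watts_per_device
print(total_w / 1e12, "TW")                                   # 1.0 TW
print(total_w / 1.2e12 * 100, "% of 2018 US capacity (~1.2 TW)")
```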
Of course, the goal is to never let the power problem get to
that point. AI developers are simplifying their models, and hardware
power efficiency continually improves via Moore’s Law and
clever circuit designs. However, one of the major challenges remains
the legacy memory technology, SRAM and DRAM (static
and dynamic RAM, respectively). These memories are hitting
a wall on size and power efficiencies and now often dominate
system power consumption and cost.
The edge-computing conundrum
The core promise of AI is also its biggest challenge for the edge:
the model needs to be constantly adapting and improving. Not
only do AI models require a colossal amount of data and time to
learn, but they’re never truly “done.” If it were that simple, self-driving cars wouldn’t still have so much difficulty simply getting out of a parking lot.
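One way to picture continual learning on a constrained device is incremental (online) training, where the model is updated from small batches of freshly collected data rather than retrained from scratch. The sketch below uses scikit-learn's SGDClassifier and partial_fit purely as an illustration of that pattern; it is not how any particular edge product works, and the data stream is simulated.

```python
# Minimal sketch of on-device incremental learning using online SGD.
# Each call to partial_fit updates the model with a small batch of new
# sensor data, so the model keeps adapting without a full retrain.
import numpy as np
from sklearn.linear_model import SGDClassifier

model = SGDClassifier(loss="log_loss")   # a small linear model fits in edge memory
classes = np.array([0, 1])               # e.g. "no event" vs "event"

def update_on_device(model, new_features, new_labels):
    """Fold a fresh batch of locally collected samples into the model."""
    model.partial_fit(new_features, new_labels, classes=classes)
    return model

# Simulated stream of small local batches (random stand-in data)
rng = np.random.default_rng(0)
for _ in range(10):
    X_batch = rng.normal(size=(32, 8))
    y_batch = (X_batch.sum(axis=1) > 0).astype(int)
    update_on_device(model, X_batch, y_batch)

print(model.predict(rng.normal(size=(3, 8))))
```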
Even when AI models have been transitioned to the edge,
they still need to be capable of continuing to learn and de-
Jeff Lewis is Senior VP of Business Development at Spin Memory - www.spinmemory.com