Send Link mailing list submissions to
        [email protected]

To subscribe or unsubscribe via the World Wide Web, visit
        https://mailman.anu.edu.au/mailman/listinfo/link
or, via email, send a message with subject or body 'help' to
        [email protected]

You can reach the person managing the list at
        [email protected]

When replying, please edit your Subject line so it is more specific
than "Re: Contents of Link digest..."


Today's Topics:

   1. Clouds .. time to open up the ROI calculator (Stephen Loosley)
   2. Carbon-based transistors instead of silicon - extremely
      energy efficient (Stephen Loosley)


----------------------------------------------------------------------

Message: 1
Date: Fri, 06 Sep 2024 15:31:54 +0930
From: Stephen Loosley <[email protected]>
To: "link" <[email protected]>
Subject: [LINK] Clouds .. time to open up the ROI calculator
Message-ID: <[email protected]>
Content-Type: text/plain; charset="UTF-8"

Admins wonder if the cloud was such a good idea after all





As AWS, Microsoft, and Google hike some prices .. time to open up the ROI 
calculator





By Richard Speed, Wed 4 Sep 2024
https://www.theregister.com/2024/09/04/cloud_buyers_regret/

After an initial euphoric rush to the cloud, administrators are questioning the 
value and promise of the tech giants' services.



According to a report published by UK cloud outfit Civo, more than a third of 
organizations surveyed reckoned that their move to the cloud had failed to live 
up to promises of cost-effectiveness. Over half reported a rise in their cloud 
bill.



Although the survey, unsurprisingly, paints Civo in a flattering light, some of 
its figures may make uncomfortable reading for customers sold on the promises 
from hyperscalers.



Like-for-like comparisons for a simple three-node cluster with 200 GB of 
persistent storage and a 5 TB data transfer showed prices going from $1,278.58 
in 2022 to $1,458.68 in 2024 on Microsoft Azure.



For Google, the price went from $1,107.61 to $1,250.35. According to Civo's 
figures, the cost at AWS increased from $1,142.46 to $1,234.59.
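For a quick sanity check, Civo's published figures work out to the following 
percentage rises (a reader's arithmetic on the numbers above, not part of the 
report itself):

```python
# Civo's like-for-like cluster pricing, 2022 -> 2024, per hyperscaler.
prices = {
    "Azure":  (1278.58, 1458.68),
    "Google": (1107.61, 1250.35),
    "AWS":    (1142.46, 1234.59),
}

for vendor, (old, new) in prices.items():
    rise = (new - old) / old * 100  # percentage increase over two years
    print(f"{vendor}: {rise:.1f}% increase")
```

That puts Azure up roughly 14.1 percent, Google 12.9 percent, and AWS 8.1 
percent over two years, all comfortably ahead of most inflation figures for 
the period.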



"The Kubernetes prices were taken from the hyperscaler pricing calculators," a 
Civo spokesperson told The Register.



In the IT world, there is an expectation that bang for buck increases as time 
goes by, but in this example, prices are rising faster than the rate of 
inflation, and what customers receive for their money remains unchanged.



John-David Lovelock, VP analyst at Gartner, said CIOs had been conditioned not 
to expect price increases since the cloud emerged.



"Cost control, based on operating datacenters at massive scale, was part of the 
early sales pitch and in the intervening 15 years, it had proven out: cloud 
product costs were stable, and either went down in price or more features were 
added at the same price," he told us.



"However, the rapid rise in the cost of electricity post-pandemic, coupled with 
the rising cost of skilled IT staff, put cloud delivery under new cost 
pressures that had to be passed on, from hyperscalers to platform provider, 
from platform provider to software provider, and finally from software 
providers to clients.



"While there are cost pressures behind these increases being felt across the 
cloud spectrum, opportunistic price increases cannot be ruled out."



Microsoft and Google decided not to officially comment on the survey findings. 
However, a representative for one of the hyperscalers retorted that the figures 
seemed cherry-picked and pointed out that, as an example, customers using 
reserved instances could realize significant savings.



In response to the suggestion that the figures had been "cherry-picked," a Civo 
spokesperson said: "The configuration we used, a three-node cluster with 200 GB 
Persistent Volume and 5 TB data transfer, is one we've found to be commonly 
selected by our diverse customer base. While we understand that no single setup 
can represent every use case perfectly, we believe this configuration offers a 
helpful reference point for many potential customers."



An AWS spokesperson sent us a statement: "IT providers often tout their pricing 
in direct comparison to AWS, which encourages further price competition. AWS 
has reduced prices 134 times since AWS launched in 2006.



"These price reductions have occurred even as AWS has continuously improved 
reliability, availability, security, and performance. In addition, AWS offers 
management tools that make it easier for customers to monitor and optimize 
their cloud costs."



Despite such protestations, analysts have long predicted an increase in public 
cloud prices. In 2022, Canalys warned that prices could jump by a third, and 
several companies have begun to question the cost of operating services in the 
cloud compared to running on-premises.



But is a retreat from the cloud likely? Lovelock thinks not: "CIOs cannot 
turn their back on cloud."



The giddy enthusiasm might have waned in favor of some hard-nosed ROI 
calculations, and some workloads might jump away from cloud vendors, "but this 
will not constitute a change in direction, just a ripple in the stream of 
dollars flowing to the cloud."



So, are prices increasing? The answer has to be yes. How much of those rises 
are down to the major vendors opportunistically adding a few percentage 
points, versus an increase in fixed costs such as electricity, is pretty much 
irrelevant. The advice remains the same: the cloud is here to stay, although 
its luster has dulled over time.



Time, then, to wheel out the ROI calculator and ensure there's been no stealthy 
vendor lock-in.



All clouds and all workloads are, after all, not created equal.



---

------------------------------

Message: 2
Date: Fri, 06 Sep 2024 21:42:02 +0930
From: Stephen Loosley <[email protected]>
To: "link" <[email protected]>
Subject: [LINK] Carbon-based transistors instead of silicon -
        extremely energy efficient
Message-ID: <[email protected]>
Content-Type: text/plain; charset="UTF-8"

Specialist 'carbon nanotube' AI chip built by Chinese scientists is 1st of its 
kind and '1,700 times more efficient' than Google's

By Owen Hughes, 4th Sep 2024
https://www.livescience.com/technology/electronics/specialist-carbon-nanotube-ai-chip-built-by-chinese-scientists-is-1st-of-its-kind-and-1700-times-more-efficient-than-googles-version


Scientists in China have developed a tensor processing unit (TPU) that uses 
carbon-based transistors instead of silicon, and they say it's extremely 
energy efficient.

[Image caption: Unlike conventional TPUs, this new chip is the first to use 
carbon nanotubes (tiny, cylindrical structures made of carbon atoms arranged 
in a hexagonal pattern) in place of traditional semiconductor materials like 
silicon. Image credit: Getty Images/sankai]


Scientists in China have built a new type of tensor processing unit (TPU), a 
special type of computer chip, using carbon nanotubes instead of a traditional 
silicon semiconductor. They say the new chip could open the door to more 
energy-efficient artificial intelligence (AI).


AI models are hugely data-intensive and require massive amounts of 
computational power to run. This presents a significant obstacle to training 
and scaling up machine learning models, particularly as the demand for AI 
applications grows.

This is why scientists are working on making new components, from processors 
to computing memory, that are designed to consume orders of magnitude less 
energy while running the necessary computations.

Google scientists created the TPU in 2015 to address this challenge.

These specialized chips act as dedicated hardware accelerators for tensor 
operations, the complex mathematical calculations used to train and run AI 
models. By offloading these tasks from the central processing unit (CPU) and 
graphics processing unit (GPU), TPUs enable AI models to be trained faster and 
more efficiently.

Unlike conventional TPUs, however, this new chip is the first to use carbon 
nanotubes (tiny, cylindrical structures made of carbon atoms arranged in a 
hexagonal pattern) in place of traditional semiconductor materials like 
silicon. This structure allows electrons (charged particles) to flow through 
them with minimal resistance, making carbon nanotubes excellent conductors of 
electricity.

The scientists published their research on July 22 in the journal Nature 
Electronics.
https://www.nature.com/articles/s41928-024-01211-2

According to the scientists, their TPU consumes just 295 microwatts (µW) of 
power (where 1 W is 1,000,000 µW) and can deliver 1 trillion operations per 
watt, a unit of energy efficiency. By comparison, Google's Edge TPU can 
perform 4 trillion operations per second (TOPS) using 2 W of power.

This makes China's carbon-based TPU nearly 1,700 times more energy-efficient.

"From ChatGPT to Sora, artificial intelligence is ushering in a new revolution. 
But traditional silicon-based semiconductor technology is increasingly unable 
to meet the processing needs of massive amounts of data," Zhiyong Zhang, 
co-author of the paper and professor of electronics at Beijing's Peking 
University, told TechXplore.

"We have found a solution in the face of this global challenge," Zhang added.

The new TPU is composed of 3,000 carbon nanotube transistors and is built with 
a systolic array architecture, a network of processors arranged in a grid-like 
pattern.

Systolic arrays pass data through each processor in a synchronized, 
step-by-step sequence, similar to items moving along a conveyor belt.

This enables the TPU to perform multiple calculations simultaneously by 
coordinating the flow of data and ensuring that each processor works on a small 
part of the task at the same time.


This parallel processing enables computations to be performed much more 
quickly, which is crucial for AI models processing large amounts of data. It 
also reduces how often the memory (specifically, a type called static 
random-access memory, or SRAM) needs to read and write data, Zhang said. By 
minimizing these operations, the new TPU can perform calculations faster while 
using much less energy.
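The conveyor-belt dataflow described above can be illustrated with a toy 
simulation (a reader's sketch, not the actual design of the chip): each grid 
cell owns one output element of a matrix product, and operands stream in 
skewed so the right pair meets at the right cell on the right clock tick.

```python
def systolic_matmul(A, B):
    """Toy simulation of an output-stationary systolic array.

    A is n x k and B is k x m (lists of lists). Cell (i, j) of the grid
    accumulates output element C[i][j]; operands are skewed so that
    A[i][s] and B[s][j] arrive at cell (i, j) exactly on tick t = i + j + s.
    """
    n, k, m = len(A), len(A[0]), len(B[0])
    C = [[0.0] * m for _ in range(n)]
    for t in range(n + m + k - 2):        # clock ticks until the wavefront drains
        for i in range(n):                # every cell works on every tick...
            for j in range(m):
                s = t - i - j             # ...on the partial product due now
                if 0 <= s < k:
                    C[i][j] += A[i][s] * B[s][j]
    return C
```

For [[1, 2], [3, 4]] times [[5, 6], [7, 8]] this reproduces the ordinary 
matrix product [[19, 22], [43, 50]], with many cells accumulating in 
parallel on each tick, which is the property the article credits for the 
chip's speed and reduced SRAM traffic.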


To test their new chip, the scientists built a five-layer neural network (a 
collection of machine learning algorithms designed to mimic the structure of 
the human brain) and used it for image recognition tasks. The TPU achieved an 
accuracy rate of 88% while maintaining power consumption of only 295 µW.

In the future, similar carbon nanotube-based technology could provide a more 
energy-efficient alternative to silicon-based chips, the researchers said.

The scientists plan to continue refining the chip to improve its performance 
and make it more scalable, they said, including by exploring how the TPU could 
be integrated into silicon CPUs.


--
Owen Hughes is a freelance writer and editor specializing in data and digital 
technologies. Previously a senior editor at ZDNET, Owen has been writing about 
tech for more than a decade, during which time he has covered everything from 
AI, cybersecurity and supercomputers to programming languages and public sector 
IT. Owen is particularly interested in the intersection of technology, life and 
work; in his previous roles at ZDNET and TechRepublic, he wrote extensively 
about business leadership, digital transformation and the evolving dynamics of 
remote work.

--




------------------------------

Subject: Digest Footer

_______________________________________________
Link mailing list
[email protected]
https://mailman.anu.edu.au/mailman/listinfo/link


------------------------------

End of Link Digest, Vol 382, Issue 3
************************************
