
PyTorch Foundation Makes PyTorch 2.0 Generally Available

The PyTorch Foundation has announced the general availability of PyTorch 2.0, an open-source platform for machine learning (ML) training that’s among the most widely used technologies in the field.

PyTorch 1.0 first launched in 2018 and benefited from years of incremental improvements before the beta of PyTorch 2.0 went into preview in December 2022. The effort to enable more open governance and to encourage collaboration and contributions has paid dividends, with PyTorch 2.0 benefiting from new code provided by 428 contributors.

One key feature on offer is Accelerated Transformers, formerly known as “Better Transformers”, which sit at the heart of modern large language models (LLMs) and generative AI. Particularly exciting is a speedup of between 1.5x and 2x when training Transformer models, while also making it easier for developers to work quickly with state-of-the-art transformer models.

New kernel architectures provide high-performance support for scaled dot product attention (SDPA). Multiple hardware types are supported, each with custom kernels, and integrated dispatch logic picks the highest-performing kernel for a given model and hardware type. This lets developers train models faster than on previous iterations of PyTorch, with just one added line of code enabling the speedup.
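
To make that concrete, here is a minimal sketch, assuming a toy Transformer encoder as a stand-in for a real model: the single torch.compile call and the torch.nn.functional.scaled_dot_product_attention function are the PyTorch 2.0 APIs the article refers to, while the model and tensor shapes are purely illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical toy model used only for illustration.
model = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=512, nhead=8, batch_first=True),
    num_layers=6,
)

# The "one line of code": compile the model with the PyTorch 2.0 compiler stack.
compiled_model = torch.compile(model)

# Scaled dot product attention is also exposed directly; PyTorch dispatches to
# the fastest available kernel for the current hardware.
q = k = v = torch.randn(2, 8, 128, 64)  # (batch, heads, sequence, head_dim)
out = F.scaled_dot_product_attention(q, k, v)
```

The compiled model is used like the original module; the speedup comes from the optimized kernels selected under the hood.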

Although AI and ML work is often aimed at GPUs, Intel helped lead the work on improving PyTorch for CPUs. Its TorchInductor CPU optimizations help the new features of the compiler released as part of PyTorch 2.0 run on CPUs without sacrificing performance.

“The end-user benefit is they just select a single CPU backend, with best performance and best portability,” said Arun Gupta, VP and GM of Open Ecosystems at Intel, who noted that compilation is a powerful technology for helping users get good performance.

For more information, check out the source article.

New electric current treatment offers hope for prostate cancer patients

Prostate cancer could be treated with a revolutionary one-hour procedure using electrical currents to destroy hard-to-reach tumours, according to leading surgeons. The new technique, called irreversible electroporation or Nanoknife, has recently been used for the first time in the NHS and provides hope for thousands of men who face surgery or radiotherapy with distressing side-effects.

The NHS has now carried out six operations using the procedure, which is minimally invasive and precise. Surgeons are optimistic about its potential as a standard treatment for certain types of prostate cancer. Unlike traditional treatments, which can cause urinary problems or loss of sexual function, the Nanoknife method can be carried out quickly and results in fewer side effects: short electrical pulses are administered into tumours, significantly reducing the risk to the healthy tissue surrounding them.

Charities have praised this “amazingly simple” treatment, which could help up to 20,000 men diagnosed with localised disease each year. The Nanoknife technique joins other focused therapies – such as cryotherapy and focused ultrasound – available only at major specialist centres. The speed and efficiency of day-case procedures also make them much more attractive within an already overwhelmed system.

The use of targeted electrical therapies opens new doors in the fight against one of the most common cancers among UK men. Similar techniques are expected to be trialled soon so they can be used more widely across hospitals in England, as part of efforts to tackle a huge cancer backlog made worse by Covid-19-related delays.

For more information, check out the source article.

The Importance of Standardizing AI Infrastructure for Successful Adoption

Artificial Intelligence (AI) is transforming organizations, but gaining widespread adoption and sparking innovation requires standardization, cost management, and governance. Unfortunately, many enterprises struggle with all three. The sprawl of diverse tools and technologies across organizations often results in inconsistent experiences between groups, and scaling pilots into production can be difficult.

Managing AI costs remains a major challenge for many IT leaders as new projects can rapidly grow out of control. Selecting, building, and integrating robust infrastructure needed for AI can become a budget buster, especially in on-premises environments. As for governance, companies often silo their AI efforts or spread them out across teams without sufficient oversight from IT.

The Power of “AI-first” Infrastructure

A purpose-built, end-to-end optimized AI environment based in the cloud can address all three requirements effectively. Standardizing on clouds, tools and platforms such as NVIDIA AI Enterprise replaces an eclectic sprawl of diverse technologies across the organization with an optimized end-to-end environment that works together seamlessly – much like standardized virtualization or database solutions.

An enterprise’s workloads determine how its infrastructure should be optimized: a smaller, right-sized footprint lowers costs dramatically while still accelerating AI workloads, reducing training times and benefiting from economies-of-scale purchasing and integration.

Simplifying Cost Control and Governance

The simplified management offered by standardized platforms also drives a culture of innovation within a company, because it opens up access to more teams who can develop their own ideas within a secure ecosystem. Full visibility into who is making purchases and what they are buying, alongside crucial metrics such as the cost audits provided with each platform’s solution, enables IT to retake accountability for overall spending, supporting auditability and regulatory compliance while safeguarding the confidentiality of business-critical data against bad actors.

Making AI Accessible Across The Organization

Dedicated, standardized platforms help accelerate time-to-market by halting chaos-driven reinvention before it starts, because a unified approach empowers more people to use AI without starting from scratch every time. Businesses looking toward the next wave of technology need to bet on an all-in-one platform that delivers across the board if they want to stay ahead of their industry peers – so find out how dedicated infrastructure can unlock innovation at any level of your enterprise today.

For more information, check out the source article.

TypeScript 5.0 is here!

Developers rejoice! Microsoft has released TypeScript 5.0, bringing with it numerous improvements and features to the popular programming language. TypeScript builds on JavaScript by adding syntax for types that can be used for type-checking, resulting in greater code accuracy and reliability.

This release includes an implementation of the new decorators standard, improved support for ESM projects in Node and bundlers, better control of generic inference for library authors, expanded JSDoc functionality, simplified configuration, and other significant changes that make TypeScript smaller and faster.

But don’t worry if you’re already familiar with TypeScript – the update will not require developers to relearn how to use it, as nothing has been significantly changed or removed. While there are a few deprecations of less-used options in version 5.0, upgrading should be no different from previous updates. To get started with TypeScript 5.0, you can obtain it through NuGet or npm.

What’s new

The release marks a significant milestone, bringing changes such as: improved case-insensitive import sorting in editor scenarios, which now works better with existing tooling by default; a minimum required version of Node.js (12.20) specified in its package.json; support for decorators placed before or after export and export default; a new bundler module resolution option, permitted when the --module option is set to esnext; and more.
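
As a minimal sketch of that last point (an assumed example configuration, not one taken from the release notes), enabling the new resolution mode looks roughly like this:

```jsonc
// tsconfig.json – hypothetical minimal configuration
{
  "compilerOptions": {
    "module": "esnext",
    "moduleResolution": "bundler"
  }
}
```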

Decorators

TypeScript 5.0 also introduces decorators – an upcoming ECMAScript feature that allows classes and their members to be customized in a reusable way while keeping everything DRY (don’t repeat yourself). By defining a function such as “loggedMethod”, we can apply it to an existing method such as “greet” without modifying the greet method itself every time we want to add logging before or after it executes, making modularization easier than ever before.
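
The sketch below illustrates the idea under the new ECMAScript decorators semantics; loggedMethod and greet follow the names used in the announcement, while the Person class is a hypothetical stand-in.

```typescript
// A reusable method decorator that logs before and after the wrapped call.
function loggedMethod(originalMethod: any, context: ClassMethodDecoratorContext) {
  const methodName = String(context.name);
  function replacementMethod(this: any, ...args: any[]) {
    console.log(`LOG: Entering method '${methodName}'.`);
    const result = originalMethod.call(this, ...args);
    console.log(`LOG: Exiting method '${methodName}'.`);
    return result;
  }
  return replacementMethod;
}

class Person {
  name: string;
  constructor(name: string) {
    this.name = name;
  }

  // greet itself stays untouched; the decorator adds the logging around it.
  @loggedMethod
  greet() {
    console.log(`Hello, my name is ${this.name}.`);
  }
}

new Person("Ada").greet();
// LOG: Entering method 'greet'.
// Hello, my name is Ada.
// LOG: Exiting method 'greet'.
```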

For more information, check out the source article.
