Categories
Misc

Dive into the Future of Graphics with NVIDIA Omniverse On-Demand Sessions

NVIDIA Omniverse is setting a new standard in real-time graphics for developers. Teams across industries are now using the open, cloud-native platform to bring new levels of virtual collaboration and photorealistic simulation to their projects. And with open beta availability recently announced, more developers around the world can experience Omniverse and explore ways to integrate technologies or connect applications.

Check out some of the resources on the NVIDIA On-Demand catalog to learn more tips and tricks for developing in Omniverse:

Getting Started with Omniverse Launcher: This session covers installing and configuring the Omniverse Launcher, along with an overview of how to install applications and connectors.

Omniverse Create Overview: Learn how Omniverse Create accelerates advanced scene composition and allows users to assemble, light, simulate, and render complex USD scenes in real time.

Omniverse View Overview: This session is an introduction to Omniverse View, an application created specifically for architecture, engineering, and design professionals.

What Makes USD Unique: USD is the backbone of the Omniverse collaboration technology. This video discusses Pixar’s USD file format, explains the basics of its structure, and introduces layers, references, and sublayers (a small code sketch of these concepts follows the session list).

Omniverse Five Things to Know About Materials: This talk shows users where to find materials in Omniverse Create and how to interact with them, how to create and import your own MDL materials, and how to convert existing materials for use in Omniverse.

Intro to Omniverse Unreal Engine 4 Connector: Get a brief introduction to the Omniverse Unreal Engine 4 (UE4) Connector, which consists of two plugins: a USD plugin and an MDL plugin. This connector lets creators live-link Omniverse Applications (like View and Create) with UE4.

Deep Dive into Omniverse Kit: Get an introduction to Omniverse Kit and learn how developers can leverage this powerful toolkit to create new Omniverse Apps and extensions.
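
As mentioned in the USD session description above, here is a minimal sketch of sublayers and references using Pixar’s pxr Python API; the file names and prim paths are placeholders:

    # Minimal sketch of USD layering concepts (sublayers and references)
    # with Pixar's pxr Python API. File names and prim paths are
    # illustrative placeholders.
    from pxr import Usd, UsdGeom

    # A base layer containing one prim.
    base = Usd.Stage.CreateNew("base.usda")
    UsdGeom.Sphere.Define(base, "/World/Ball")
    base.GetRootLayer().Save()

    # A stage whose root layer sublayers the base: opinions authored
    # here override base.usda without modifying it.
    shot = Usd.Stage.CreateNew("shot.usda")
    shot.GetRootLayer().subLayerPaths.append("base.usda")

    # A reference pulls a prim from another layer in under a new path.
    ref_prim = shot.DefinePrim("/World/BallRef")
    ref_prim.GetReferences().AddReference("base.usda", "/World/Ball")
    shot.GetRootLayer().Save()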

Download Omniverse today and check out other Omniverse sessions on the NVIDIA On-Demand portal.

Categories
Misc

NVIDIA Expands vGPU Software to Accelerate Workstations, AI Compute Workloads

Designers, engineers, researchers, and creative professionals all need the flexibility to run complex workflows – no matter where they’re working from. With the newest release of NVIDIA virtual GPU (vGPU) technology, enterprises can provide their employees with more power and flexibility through GPU-accelerated virtual machines from the data center or cloud. Available now, the latest version Read article >

The post NVIDIA Expands vGPU Software to Accelerate Workstations, AI Compute Workloads appeared first on The Official NVIDIA Blog.

Categories
Misc

A Sense of Responsibility: Lidar Sensor Makers Build on NVIDIA DRIVE

When it comes to autonomous vehicle sensor innovation, it’s best to keep an open mind — and an open development platform. That’s why NVIDIA DRIVE is the chosen platform on which the majority of these sensors run. In addition to camera sensors, NVIDIA has long recognized that lidar is a crucial component to an autonomous Read article >

The post A Sense of Responsibility: Lidar Sensor Makers Build on NVIDIA DRIVE appeared first on The Official NVIDIA Blog.

Categories
Misc

NVIDIA Announces Nsight Graphics 2021.1

Nsight Graphics 2021.1 is available to download – check out this article to see what’s new.

You can now set any key as the capture shortcut. This new keybinding is supported for all activities, including GPU Trace. F11 is the default binding for both capture and trace, but if you prefer the old behavior, the original capture keybinding is still supported (set the ‘Frame Capture (Target) > Legacy Capture Chord’ setting to Yes).

You can now profile applications that use D3D12 or Vulkan strictly for compute tasks using the new ‘One-shot’ option in GPU Trace. Tools that generate normal maps or use DirectML for image upscaling can now be properly profiled and optimized. To enable this, set the ‘Capture Type’ to ‘One-shot [Beta]’.

While TraceRays/DispatchRays has been the common way to initiate ray generation, it’s now possible to trace rays directly from your compute shaders using DXR 1.1 and the new Khronos Vulkan Ray Tracing extension. To support this new approach, we’ve added links to the acceleration structure data for applications that use RayQuery calls in compute shaders.

It’s important to know how much GPU memory you’re using and to keep it as low as possible in ray tracing applications. We’re now making this even easier by adding size information to the Acceleration Structure Viewer.

Finally, we’ve added the Nsight HUD to Windows Vulkan applications in all frame debugging capture states. Previously the HUD was only activated once an application was captured.

We’re always looking to improve our HUD, so please give us any feedback you might have.

For more details on Nsight Graphics 2021.1, check out the release notes (link).

We want to hear from you! Please continue to use the integrated feedback button that lets you send comments, feature requests, and bugs directly to us with the click of a button. You can send feedback anonymously or provide an email so we can follow up with you about your feedback. Just click on the little speech bubble at the top right of the window.

Try out the latest version of Nsight Graphics today!

Khronos released the final Vulkan Ray Tracing extensions today, and NVIDIA Vulkan beta drivers supporting them are available for download. Welcome to the era of portable, cross-vendor, cross-platform ray tracing acceleration!

Categories
Misc

Certifiably Fast: Top OEMs Debut World’s First NVIDIA-Certified Systems Built to Crush AI Workloads

AI, the most powerful technology of our time, demands a new generation of computers tuned and tested to drive it forward. Starting today, data centers can boot up a new class of accelerated servers from our partners to power their journey into AI and data analytics. Top system makers are delivering the first wave Read article >

The post Certifiably Fast: Top OEMs Debut World’s First NVIDIA-Certified Systems Built to Crush AI Workloads appeared first on The Official NVIDIA Blog.

Categories
Misc

TensorFlow Lite Support


submitted by /u/nbortolotti

[visit reddit]

[comments]

Categories
Misc

Is my data actually predictive?

Hey, very much a beginner with TensorFlow, but I’ve been enjoying it so far.

Background: response between 0-200, 43 variables, regression-type problem, dataset is over 200k rows.

I’ve built a basic sequential model using Keras, and my loss and validation loss look ideal, i.e. validation loss is slightly above training loss, and the curves look as they should.

However, my actual loss seems quite high: it converges around 34, and I’d have liked it to be around 20. Because of the above, I’m not sure if this means my data isn’t actually predictive?!

I have standardised many variables rather than normalised them; I’m not sure if this would make any difference.

Is there anything you think I could add? I don’t think the dataset is particularly lacking in dimensions.
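
A quick sanity check: compare the model’s validation loss with a naive baseline that always predicts the training mean. A minimal sketch, assuming mean absolute error is the loss (the y arrays below are random placeholders for the real splits):

    # Compare validation loss against a naive predict-the-mean baseline.
    # Assumes MAE as the loss; y_train / y_val are random placeholders
    # standing in for the real training/validation targets.
    import numpy as np

    y_train = np.random.uniform(0, 200, size=160_000)  # placeholder targets
    y_val = np.random.uniform(0, 200, size=40_000)     # placeholder targets

    baseline = np.full_like(y_val, y_train.mean())
    baseline_mae = np.mean(np.abs(y_val - baseline))
    print(f"naive-mean baseline MAE: {baseline_mae:.2f}")

    # If the model's ~34 sits far below this baseline, the features do
    # carry signal even if 34 is higher than hoped; if it sits close to
    # the baseline, the data may simply not be very predictive.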

submitted by /u/Accomplished-Big4227

[visit reddit]

[comments]

Categories
Misc

[Help Please] Applying an already trained model to an image

Hello,

I am new to TensorFlow and am trying to figure out what I think should be a rather simple task. I have a model (.pb file) that was given to me, and I need to use it to mark up an image.

The model was trained on two classes: background and burnish.

From this point on, I have literally no idea what I am doing. I tried searching online, and there is a lot about how to train a model, but I don’t need to be able to do that.

Any help pointing me in the right direction would be awesome!
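
One possible starting point, as a hypothetical sketch for TF 2.x: this assumes the .pb is the saved_model.pb inside a SavedModel directory and that the model is a two-class per-pixel segmenter. The paths, input size, and preprocessing are all assumptions; a TF1-style frozen graph would need tf.compat.v1 loading instead.

    # Hypothetical sketch: run a SavedModel (.pb inside a directory) on
    # one image. The path, 224x224 input size, and 1/255 scaling are
    # assumptions; inspect infer.structured_input_signature and
    # infer.structured_outputs to see what the model really expects.
    import tensorflow as tf

    model = tf.saved_model.load("path/to/saved_model_dir")
    infer = model.signatures["serving_default"]

    img = tf.io.decode_image(tf.io.read_file("part.png"),
                             channels=3, expand_animations=False)
    img = tf.image.resize(img, (224, 224)) / 255.0
    batch = tf.expand_dims(img, 0)  # add batch dimension

    out = infer(batch)                    # dict of output tensors
    scores = list(out.values())[0]        # assume a single output
    mask = tf.argmax(scores, axis=-1)[0]  # 0 = background, 1 = burnish
    tf.io.write_file("mask.png",
                     tf.io.encode_png(tf.cast(mask * 255, tf.uint8)[..., None]))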

submitted by /u/barrinmw

[visit reddit]

[comments]

Categories
Misc

Upcoming Webinars: Learn About the New Features of JetPack 4.5 and VPI API for Jetson

JetPack SDK 4.5 is now available. This production release features enhanced secure boot, disk encryption, a new way to flash Jetson devices through Network File System, and the first production release of Vision Programming Interface.

For embedded and edge AI developers, the latest NVIDIA JetPack update is available. It includes the first production release of the Vision Programming Interface (VPI) to accelerate computer vision on Jetson. Visit our download page to learn more.

This production release features:

  • Enhanced secure boot and support for disk encryption
  • Improved Jetson Nano™ bootloader functionality
  • A new way of flashing Jetson devices using Network File System
  • V4L2 API extended to support CSI cameras (see the capture sketch below)

Download now >>
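
As a hypothetical illustration of that extended V4L2 path, here is a minimal frame grab through OpenCV’s V4L2 backend; the device index and resolution are assumptions (CSI cameras typically appear as /dev/video0):

    # Hypothetical sketch: grab one frame from a CSI camera via V4L2
    # using OpenCV. Device index and resolution are assumptions.
    import cv2

    cap = cv2.VideoCapture(0, cv2.CAP_V4L2)  # /dev/video0 through V4L2
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1280)
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 720)

    ok, frame = cap.read()
    if ok:
        cv2.imwrite("frame.png", frame)
    cap.release()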

Upcoming Webinars

The Jetson team is hosting two webinars with live Q&A to dive into JetPack’s new capabilities. Learn how to get the most out of your Jetson device and accelerate development. 

NVIDIA JetPack 4.5 Overview and Feature Demo
February 9 at 9 a.m. PT

This webinar is a great way to learn about what’s new in JetPack 4.5. We’ll provide an in-depth look at the new release and show a live demo of select features. Come with questions—our Jetson experts will be hosting a live Q&A after the presentation.

Register now >> 

Implementing Computer Vision and Image Processing Solutions with VPI
February 11 at 9 a.m. PT

Get a comprehensive introduction to the VPI API. You’ll learn how to build a complete and efficient stereo disparity-estimation pipeline with VPI that runs on Jetson-family devices. VPI provides a unified API for both CPU and NVIDIA CUDA algorithm implementations, as well as interoperability with OpenCV and CUDA.
Register now >>
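
The VPI API itself isn’t shown here, but as a conceptual stand-in for what such a disparity pipeline computes, here is a minimal stereo sketch with OpenCV (which VPI interoperates with); the file names and block-matching parameters are placeholders:

    # Conceptual stand-in for a stereo disparity pipeline, using OpenCV
    # rather than VPI. File names and parameters are placeholders, and
    # the inputs are assumed to be rectified grayscale images.
    import cv2

    left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
    right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

    # Classic block matching: 64 disparity levels, 15x15 window.
    stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = stereo.compute(left, right)  # fixed-point (scaled by 16)

    vis = cv2.normalize(disparity, None, 0, 255, cv2.NORM_MINMAX)
    cv2.imwrite("disparity.png", vis.astype("uint8"))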

Categories
Misc

Anyone have a good example/tutorial for TF attention/transformers from scratch?

I have searched a lot of tutorials and courses; most start with a BERT model or some variation of it. I want to watch/learn how a transformer/attention model is trained from scratch.

I want to try to build an attention/transformer model for solved games like chess (i.e. I will have generate-able data).
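
For reference, a minimal from-scratch sketch of the core operation, scaled dot-product attention, in TF 2.x; the toy shapes are illustrative, and a full transformer wraps this in learned Q/K/V projections, multi-head splitting, and feed-forward layers:

    # Scaled dot-product attention from scratch in TF 2.x: the core op
    # inside every transformer block. Shapes and inputs are toy examples.
    import tensorflow as tf

    def scaled_dot_product_attention(q, k, v, mask=None):
        # q, k, v: (batch, heads, seq_len, depth)
        scores = tf.matmul(q, k, transpose_b=True)  # (..., seq_q, seq_k)
        scores /= tf.math.sqrt(tf.cast(tf.shape(k)[-1], tf.float32))
        if mask is not None:
            scores += mask * -1e9  # push masked positions toward zero weight
        weights = tf.nn.softmax(scores, axis=-1)
        return tf.matmul(weights, v), weights

    # Toy self-attention: batch=1, 1 head, 4 tokens, depth=8.
    x = tf.random.normal((1, 1, 4, 8))
    out, attn = scaled_dot_product_attention(x, x, x)
    print(out.shape, attn.shape)  # (1, 1, 4, 8) (1, 1, 4, 4)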

submitted by /u/Ok_Cryptographer2209

[visit reddit]

[comments]