The world’s first dental robot started working

Yomi is the first dental robot approved by the US Food and Drug Administration (FDA), and it is designed to make implant placement safer.

Can a robot make dentistry faster and easier? According to its makers, that is exactly what a new robot called Yomi intends to do in the field of dental implants.

The robot’s inventors say: “We created Yomi to tackle implant placement, one of the most challenging dental procedures and the one patients fear most.”

The first and only FDA-approved dental surgical robot

Yomi is the first and only robotic system approved by the US Food and Drug Administration (FDA) for dental surgery. Through a combination of tactile (haptic) feedback, visual guidance, and audio cues, the robot helps doctors place implants in the patient’s mouth with greater accuracy and precision.

Yomi’s inventors also state that patients who choose a Yomi-assisted implant procedure can walk out of the dentist’s office with a restored smile in a single same-day session, a process that can take months with conventional methods.

According to the manufacturer’s website, Yomi is a computerized robotic guidance system that assists doctors before and during dental implant surgery. The system includes a robotic arm that is always controlled and guided by the doctor, a tracker arm that attaches to the patient and follows their position in real time, much like GPS, and an easy-to-use software package called YomiPlan.

The Yomi robot combines precise digital planning with multisensory guidance of the surgical instruments. In this way, the device helps dentists place implants with a high degree of precision, efficiency, and safety.

Surgeries performed with Yomi involve smaller, suture-free incisions, which means less pain for patients and, at the same time, a much faster recovery.

Great benefits

Reviews from patients and doctors point to Yomi’s impressive performance and handling. As one patient put it: “It’s technically absolutely amazing. No one wants to go to the dentist until they have to, but I can honestly say that my experience with Yomi was pleasant.”

Addressing the robot’s inventors, he said: “This surgery was painless for me, and most importantly, I learned the technical side behind it with your help. You guys are great. I appreciate it.”

He added: “Because of Yomi, my dental surgeon was able to do all of this at once, in a single visit. It saved me four months of time, as well as the cost of the follow-up appointments.”

Another patient also said: “I was very satisfied with this experience.”

“You can feel the technology and the quality of the device,” says dentist Jody Griffin. “Its tactile guidance is incredible. This robot is like nothing else I’ve ever experienced.”

Dental specialist Scotty Boulding adds: “I have been doing implants for over 30 years, and there is no doubt that Yomi meets all the standards and allows for multiple implants.”

Another dentist, Riyad Ansari, says that robot-guided treatment is more predictable than manual surgery, avoids the drawbacks of fixed surgical guides, and improves the patient experience by increasing confidence in the treatment outcome.

Finally, it is worth noting that Yomi was invented by Alon Mozes and Juan Salcedo.

Mozes has worked on computer graphics and image-guidance applications at startups for nearly 20 years, while Salcedo has more than a decade of experience in mechanical design and manufacturing for the medical device industry.

Artificial intelligence builds cities better than humans

AI programs can design better city layouts than human planners, according to a new study.

Imagine living in a green, cool city full of parks, walking paths, bike lanes, and dedicated bus routes that take people to shops, schools, and services in minutes.

This dream is the ideal of urban planning and, in a way, the definition of a utopia: the concept of the 15-minute city, where all basic needs and services are reachable within a quarter of an hour, public health is a priority, and there is no excessive pollution from cars.

Artificial intelligence can help urban planners bring this vision closer to reality.

A new study from researchers at Tsinghua University in China shows how machine learning technology can create more efficient spatial layouts than humans in a fraction of the time.

Automation scientist Yu Zheng and his colleagues wanted to find new solutions for improving our increasingly congested cities.

They developed an artificial intelligence system to tackle the most tedious and computationally demanding urban planning tasks, and found that it produced urban layouts that performed about 50% better than human designs on three criteria: access to services, access to green space, and traffic levels.

Starting at a small scale, Zheng and his colleagues instructed their model to map urban areas only a few square kilometers in size.

After two days of training, the system of multiple neural networks searched for the ideal road layout and land use to match the 15-minute city concept and local planning policies and needs.
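
The study itself is not reproduced here, but as a rough illustration of how such planning criteria can be turned into something a program can optimize, the Python sketch below scores a toy grid layout on the three measures mentioned above: access to services, access to green space, and traffic. The grid representation, walking-speed assumption, equal weights, and function names are all illustrative assumptions, not the researchers’ actual model or reward function.

```python
# Hypothetical sketch: scoring a candidate land-use layout on the three
# criteria named in the article (service access, green-space access, traffic).
# Distances are grid (Chebyshev) distances and ignore the road network.
import numpy as np

WALK_M_PER_MIN = 80          # ~4.8 km/h walking speed (assumption)
CELL_SIZE_M = 100            # each grid cell is 100 m x 100 m (assumption)
REACH_CELLS = 15 * WALK_M_PER_MIN // CELL_SIZE_M  # cells reachable in 15 minutes

def fraction_within_reach(homes, targets, reach=REACH_CELLS):
    """Fraction of residential cells that have a target cell within `reach`."""
    home_idx = np.argwhere(homes)
    target_idx = np.argwhere(targets)
    if len(home_idx) == 0 or len(target_idx) == 0:
        return 0.0
    # pairwise Chebyshev distances between every home cell and every target cell
    d = np.abs(home_idx[:, None, :] - target_idx[None, :, :]).max(axis=2)
    return float((d.min(axis=1) <= reach).mean())

def score_layout(land_use, traffic_load):
    """Toy score, higher is better. `land_use` is a 2-D array of codes
    (0 empty, 1 housing, 2 services, 3 park); `traffic_load` is a
    normalized congestion value in [0, 1]."""
    homes = land_use == 1
    service_access = fraction_within_reach(homes, land_use == 2)
    green_access = fraction_within_reach(homes, land_use == 3)
    return service_access + green_access - traffic_load  # equal weights (assumption)

# Example: a 20x20 district with random land use and moderate traffic.
rng = np.random.default_rng(0)
layout = rng.choice([0, 1, 2, 3], size=(20, 20), p=[0.3, 0.5, 0.1, 0.1])
print(round(score_layout(layout, traffic_load=0.4), 3))
```

A search procedure such as the neural networks described above would then propose and refine layouts to maximize a score of this kind.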

While Zheng and his colleagues’ AI model includes some features for planning larger urban areas, the overall design of an entire city is far more complex.

Automating urban design and planning processes can save a great deal of time, the researchers say. For example, the AI model completes certain tasks in seconds that would take human planners between 50 and 100 minutes.

According to the researchers, automating the most time-consuming urban planning tasks frees up planners’ time to focus on more challenging or human-centered tasks such as public engagement and aesthetics.

Rather than AI replacing people, Zheng and his colleagues describe their system as an urban planning assistant: it creates conceptual designs that are optimized by algorithms and then reviewed, adjusted, and evaluated by human experts based on community feedback.

Commenting on the study, MIT research scientist Paolo Santi wrote: “This last step is critical to a good design. Urban planning is not just allocating space to buildings, parks, etc., but designing a place where urban communities can live, work, interact, and hopefully thrive for a long time to come.”

By comparing their human-AI workflow with human-only designs, Zheng and colleagues found that this collaborative process could increase access to basic services and parks by 12 and 5 percent, respectively.

The researchers also surveyed 100 urban designers, who were not told whether the designs they were asked to choose between were created by human planners or by artificial intelligence. The AI’s designs received a significant share of the votes for some spatial layouts, but for others, survey participants showed no clear preference.

The real test, of course, will come in the communities where these plans are built, measured by the reductions in noise, heat, and pollution and the improvements in public health that good urban planning promises.

This study was published in the journal Nature Computational Science.

Artificial intelligence smells better than humans

A new artificial intelligence system can describe odors better than humans: researchers say it outperformed human panelists on 53 percent of the 400 compounds tested.

An important goal in neuroscience is understanding how our senses translate light into sight, sound into hearing, food into taste, and texture into touch. The sense of smell, however, has long puzzled researchers.

Humans find the smell of flowers pleasant and the smell of rotten food repulsive thanks to proteins in the nose called odor receptors. However, little is known about how these receptors capture airborne chemicals and turn them into perceived aromas.

To understand this phenomenon, researchers from the Monell Chemical Senses Center and Osmo, a startup based in Cambridge, Massachusetts, investigated the relationship between the brain’s olfactory perception system and chemicals in the air.

This research led the scientists to develop a machine-learning model that can now verbally describe the smell of compounds with human-level skill.

The details of this study have been published in the journal “Science”.

Extensive effort

There are about 400 active olfactory receptors in humans. These olfactory nerve proteins interact with chemicals in the air and send an electrical signal to the olfactory bulb.

According to these researchers, the number of olfactory receptors is much greater than the four receptors used for color vision or the 40 receptors used for taste.

Joel Mainland, one of the study’s senior authors and a member of the Monell Center, said in a statement: “In olfaction research, the question of what physical properties make the brain perceive the smell of molecules in the air remains a mystery. So our group worked to understand the relationship between how molecules are structured and how we perceive their smell.” The research group developed a model that can learn to associate descriptions of a molecule’s odor with the molecule’s structure.

To train the system, the team used a commercial dataset containing the molecular composition and olfactory properties of 5,000 known odorants. The shape of a molecule serves as input to an algorithm that predicts which words best describe its aroma. To verify the model’s effectiveness, the researchers also ran a blind validation procedure in which a group of trained research participants described new molecules, and their answers were then compared with the AI’s descriptions.

Fifteen participants were each given 400 odorants and instructed to use a set of 55 words—from minty to musty—to describe each molecule.
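
As a rough sketch of the structure-to-descriptor idea described above, the Python snippet below trains a multi-label model to map molecular structure to a few descriptor words. The published model is a graph neural network; the Morgan fingerprints, random-forest learner, three-word descriptor list, and tiny three-molecule training set used here are stand-in assumptions for illustration only.

```python
# Hypothetical sketch: predicting odor descriptor words from molecular
# structure. The real study trained a graph neural network on ~5,000
# labeled odorants; fingerprints plus a random forest stand in for it here.
import numpy as np
from rdkit import Chem
from rdkit.Chem import AllChem
from sklearn.ensemble import RandomForestClassifier

DESCRIPTORS = ["minty", "fruity", "floral"]   # small subset of the 55 words

def featurize(smiles: str) -> np.ndarray:
    """2048-bit Morgan (circular) fingerprint of a molecule."""
    mol = Chem.MolFromSmiles(smiles)
    fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)
    return np.array(fp)

# Toy training set: (SMILES string, multi-hot descriptor labels)
train = [
    ("CC(C)C1CCC(C)CC1O", [1, 0, 0]),   # menthol: minty
    ("CC(=O)OCC",         [0, 1, 0]),   # ethyl acetate: fruity
    ("OCCc1ccccc1",       [0, 0, 1]),   # 2-phenylethanol: floral
]
X = np.stack([featurize(s) for s, _ in train])
Y = np.array([labels for _, labels in train])

# Random forests in scikit-learn handle multi-label targets directly.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, Y)

# Predict descriptors for an unseen molecule (ethyl propanoate).
pred = model.predict(featurize("CCOC(=O)CC")[None, :])[0]
print(dict(zip(DESCRIPTORS, pred)))
```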

Impressive results

In the end, the artificial intelligence model described scents better than the average human panelist for 53 percent of the compounds tested.

The model even performed well on olfactory features for which it was not trained. “It was surprising that we never trained it to learn and describe the strength of smell, but it could still make accurate predictions,” Mainland says.

The model was able to predict a wide range of odor properties, including odor intensity, for 500,000 odor molecules, and to find hundreds of pairs of structurally different compounds that nevertheless smell alike.
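
As a hedged illustration of how such pairs might be searched for, the sketch below compares structural similarity (Tanimoto similarity of fingerprints) against the similarity of predicted odor profiles. The thresholds and function names are assumptions, not the study’s actual procedure.

```python
# Hypothetical sketch: flag pairs of molecules that are structurally
# different (low Tanimoto similarity between fingerprints) yet predicted
# to smell alike (high cosine similarity between descriptor vectors).
from itertools import combinations
import numpy as np
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

def fingerprint(smiles: str):
    return AllChem.GetMorganFingerprintAsBitVect(Chem.MolFromSmiles(smiles), 2, nBits=2048)

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

def structurally_different_smell_alikes(smiles_list, predicted_odors,
                                        struct_max=0.3, odor_min=0.8):
    """Yield (i, j, structural_sim, odor_sim) for candidate pairs."""
    fps = [fingerprint(s) for s in smiles_list]
    for i, j in combinations(range(len(smiles_list)), 2):
        struct_sim = DataStructs.TanimotoSimilarity(fps[i], fps[j])
        odor_sim = cosine(predicted_odors[i], predicted_odors[j])
        if struct_sim < struct_max and odor_sim > odor_min:
            yield i, j, struct_sim, odor_sim

# `predicted_odors` would be the model's output, e.g. an array of shape
# (number of molecules, 55) holding a score for each descriptor word.
```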

“We hope this map will be useful to researchers in chemistry, olfactory neuroscience, and psychophysics as a new tool to investigate the nature of the sense of smell,” says Mainland.

The researchers also think that the map emerging from this AI model can be organized according to metabolism, which would represent a significant change in the way scientists think about scents.

In other words, compounds whose odors are perceptually similar are likely to share the same metabolic pathway. Today, scientists classify such compounds the way chemists do, for example by asking whether a molecule contains an ester group or an aromatic ring.

According to the researchers, this study helps bring the world closer to digitizing smells so that they can be recorded and reproduced. It could also help discover new scents for the fragrance industry, which could not only reduce dependence on endangered plants but also yield new functional fragrances for uses such as repelling mosquitoes or masking bad smells.

Next, the group wants to figure out how odors combine with one another to produce a scent that the human brain perceives as distinct from any other.

Artificial intelligence is on the verge of explosion

Emad Mostaque, CEO of Stability AI and one of the pioneering leaders in artificial intelligence development, claims that the industry is only at the beginning of its journey and is about to explode.

During a virtual panel discussion held by Swiss investment bank UBS, Emad Mostaque, CEO of Stability AI, claimed that despite its growing popularity, artificial intelligence is still in its infancy.

He said: “[Compared to the development of smartphones,] we are still at the iPhone 2G and 3G stage. I think next year will be the year [artificial intelligence] takes off.”

Muhammad Emad Mostaque was born in April 1983 into a Bengali Muslim family in Jordan. A month after his birth, he was taken to Dhaka, Bangladesh, and at the age of seven he emigrated to Britain with his family.

In his twenties, he became interested in helping the Muslim world by creating online forums for Muslim communities and developing an “Islamic artificial intelligence” that would help guide people on their religious journey. Mostaque received his master’s degree in mathematics and computer science from the University of Oxford in 2005.

In 2020, he founded Stability AI, an artificial intelligence image-generation company valued at one billion dollars. In recent years, he has been recognized as one of the most influential leaders in the artificial intelligence space.

Mostaque was also one of the experts who, along with Elon Musk and several other commentators, signed the well-known letter calling for a six-month pause on artificial intelligence development, and he has since been very vocal with his opinions on the technology.

However, Mostaque is not the only one who believes that artificial intelligence is still in its early days. Michael Briest, head of European technology research at UBS, also noted that only about 6 percent of earnings statements this year mentioned AI, while about a quarter of companies in the software sector are already taking advantage of it.

If Mostaque is right, that number will rise significantly: he estimates that 50 percent of all CEOs will be mentioning artificial intelligence by next year.

He points out that so far, systems such as ChatGPT, Microsoft’s integration of artificial intelligence into the Bing search engine, Google’s Bard chatbot, and Stability AI’s own text-to-image generation have focused on consumers.

According to Mostaque, once artificial intelligence moves beyond the consumer market, we will see significant growth in its development. “It’s like the calm before the storm because the models aren’t quite ready yet,” he said.

When artificial intelligence finally takes hold at the enterprise level, it will spark intense competition among rival companies in sectors far beyond Silicon Valley. This, in turn, will push the penetration of artificial intelligence beyond anything we have experienced before.

“When your competitors start implementing it, you have to work with it because of the increased productivity and because of the competitive pressure,” said Mostaque. In addition, training artificial intelligence models does not take much time, given the huge amount of data companies already have.

He added: “You just have to have the right models and use them in the right way to get results that increase productivity.” Mostaque even went so far as to warn that those who ignore the AI revolution will be punished.

He said: “You will see that the market will punish those who do not use [artificial intelligence].”

*** AI stands for Artificial Intelligence.
