
Review of Samsung Galaxy A52s phone, price and specifications


While we may all look longingly at the Galaxy S phones as Samsung's flagships, the truth is that the A series has always been the Korean company's bread and butter. The A5x has been Samsung's best-seller every year – not the S Ultra, not the 'vanilla' S, nor its larger counterpart.

So, in a way, the A5x is the real Samsung experience – it's what most people actually get. For many users, an A-series device is their only glimpse of what a Samsung phone is like, and that obviously has its pros and cons. The latest in the best-selling A line is the A53, but the most interesting A5x so far is definitely the Galaxy A52s. And that's why we decided to give it a thorough long-term review.

Samsung Galaxy A52s long-term review

Before the Galaxy A52s, the A5x model du jour was always underwhelming, even for a mid-range smartphone, given its launch price. Obviously, a lot of people put up with that – either because it was a Samsung, or because of the usually better-than-average cameras, the ease of buying one anywhere, or the software support that has improved in recent years. Time and again, though, most of the Chinese competitors offered a better chipset at a similar price.

And that all changed with the launch of the Galaxy A52s last year. For the first time, an A5x came with a premium mid-range SoC, the Snapdragon 778G, and the Chinese rivals really had no answer, as their best alternatives used the exact same chip. It seems that this was exciting for many, given the amount of interest we’ve seen for the phone since its launch.

And then it got cheaper. And cheaper. At its current price, it competes well not only on specs, but on price too. That's practically unheard of for a mid-range Samsung, so here we have a very attractive package that we couldn't help but use as our only smartphone for a good long while, to find out whether Samsung has finally found a way to secure its mid-range sales against very aggressive Chinese competitors.


2022 has been a strange year for mid-rangers, with most successors to 2021 devices not actually outperforming them in many ways. We'd bet the Galaxy A53 falls squarely in this camp, so perhaps it's best to think of the A73 as the true successor to the Galaxy A52s, though even that doesn't come with any upgrades on the chipset front.

So it looks like Samsung can play confusing naming games just like other companies. We're by no means happy about it, but in this review we'll try to show you whether the Galaxy A52s is still a great mid-range buy at this point.


Reviewing the design and build quality of the Samsung Galaxy A52s phone

If you've seen any Samsung made recently, you'll instantly recognize this one as a Samsung too. The Korean company has quietly perfected a distinctive, recognizable design language that never feels overly aggressive or shouty. This year, even the flagship S22 Ultra has a very similar design to what you see on the Galaxy A52s, despite the price delta.

We can't say we love or hate the design – it works very well and is pleasant without being either strikingly beautiful or outright ugly. It's middle of the road, but very well executed. You can tell it's part of Samsung's lineup, and you can also tell it's a mid-range Samsung just by looking at it or touching it.


This brings us to the first point of contention for many, which is the plastic build of this phone. Yes, the frame and back are plastic. No, we were never upset about it. Sure, it's warmer to the touch than glass. But on our white version, fingerprints were almost invisible on the back, which is always appreciated. And while any phone is bound to be slippery these days, we found it to be one of the least slippery we've handled in a while.

The only small gripe we have with the plastic is the frame, which is very shiny. An attempt has clearly been made to make it look like metal, but most people will immediately notice that it isn't. Sure, it could pass for highly polished stainless steel, but in a phone this cheap? Definitely not. At best it would be aluminum, and aluminum doesn't look like that.


On the other hand, there's the choice of colors – you can pick between white, black, purple and 'mint', and we appreciate the latter two for being colorful while still contrasting sharply with the frame we discussed above. It's also easy to enjoy the color of your choice out of the box, as Samsung doesn't bother to ship any kind of case with its devices – it would rather you buy one of the models it offers at extra cost.

That's one way the Korean company stands out from the rest, and with the Galaxy A52s it stands out in a better way too, by offering IP67 certification for water and dust resistance. This means the phone is dust-tight and can be submerged in 1 meter of water for up to 30 minutes. We wish more mid-range devices had a similar rating, but for now, Samsung is ahead of the competition here.

It's not a small phone, even by today's standards, but it's not a big phone either. It was pretty easy to handle, at least for this reviewer, although the usual caveat applies: if you have small hands, one-handed use might be a problem – though you'll still have fewer problems than with an S22 Ultra. In terms of weight, it sits somewhere in the middle: not so heavy that your hand needs constant rest, but not so light that it feels insubstantial. It's just right.


It certainly doesn't feel as 'premium' as a phone with metal sides and a glass back, but honestly, it's pretty decent – not only for its current price, but even for its launch price. It does the job, and the back won't shatter if you drop it. Win-win?

Checking the speakers and headphone jack of Samsung Galaxy A52s

The Galaxy A52s offers a 3.5mm headphone jack. So if you want it, this phone has it. We wanted to say more about it, but really – what else is there to say? It exists and it works.

The phone has stereo speakers, with the bottom-firing speaker paired with the earpiece acting as the second channel. It's a traditional Samsung setup, so there's nothing inherently wrong with it, especially since at the top end, the Korean company offers some of the loudest speakers on the market. But it's one of those things that reminds you the Galaxy A52s isn't a high-end phone, so corners have been cut.


Don't get me wrong, we're happy to get stereo speakers, but these aren't very good. They're tinny (increasingly so the higher the volume goes) and not really loud even at maximum volume. Alone in a quiet room, you'll have the volume slider at 90% or higher to comfortably hear what's going on in the video you're watching or the podcast you're listening to. With any noise around you, you have to hold the phone close to your ear to understand what's being said.

The sound is also quite flat and lacks any kind of depth, apart from a small amount of bass. And that’s why we haven’t talked about listening to music yet – you really need to use headphones or a Bluetooth speaker for that. So there are dual speakers on this phone, which aren’t terrible for the price, but they’re definitely not amazing either.

Vibration motor

On the other hand, the vibration motor is really bad. It reminds us of the motors usually found in much cheaper devices – probably because it might well be the same part. It's a world apart from the motors some of Samsung's rivals put in their mid-rangers, and it's an area where the Galaxy A52s falls behind most similarly priced devices.

It just sounds very cheap (probably because it is), lacks any kind of refinement, and you can hear it more than you feel it – if that makes sense. Oh, and the further you move the vibration intensity slider to the right, the worse it sounds – but go the other way and you might not even notice it. Given that you can hardly feel it no matter where the slider is, there are basically no good settings to recommend when it comes to intensity.

Vibration settings - Samsung Galaxy A52s long-term review

There are motors that you can hear about as much as you can feel – Samsung used to put these in its flagships until the S22 generation. Then there are the ones you feel more than you hear – most people consider those the best, although this reviewer prefers the former. The point is that when you think of a great vibration motor, you'll never think of anything like what the Galaxy A52s has. Even at the phone's current price, it's disappointing.

Biometrics

The Galaxy A52s has an under-display fingerprint sensor that some of its competitors lack. Since all flagship phones these days have an under-display sensor, you'd think this is where the Galaxy A52s punches above its weight. And if we're just talking about accuracy, it's really good. Not quite flagship level, but very, very close.

Subjectively, we'd put the accuracy at 95% at least, but probably more like 97-98%. Again, very good, especially at this price. That all changes if your fingers are wet or sweaty, when accuracy drops significantly – but the same goes for any optical sensor like this, regardless of the price of the phone it's mounted in. It's just a limitation of the technology used.

So the accuracy is high; what's not good at all is how slow the sensor is, even with all the associated animations disabled (isn't it funny that Samsung has a setting for that? Almost as if the engineers know the animations needlessly slow down the unlocking process). It's an optical sensor and not the ultrasonic kind seen on the S line, but that certainly isn't the reason for the slowness – we've used optical fingerprint scanners from many other companies, in flagships as well as mid-rangers, that are quite a lot faster.


So we don't know what's going on, but if you're used to other in-display fingerprint sensors (on non-Samsung mid-range devices), you'll constantly find yourself swiping up too soon – and the phone will passive-aggressively tell you that you did. This sensor seems to be about half a second slower than others on the market. If the phone were new, we'd hope it was a software issue that could be fixed via an update, but the Galaxy A52s has been on sale for months and has received many updates, and it's still the same.

That means you'll probably have to learn to live with it. It's not that hard to do, but it is frustrating, especially when competing devices prove this kind of slowness isn't a given at this price point. Then again, some of those come with faster sensors that are less accurate, so maybe it's a trade-off, like most things.

Biometrics settings - Samsung Galaxy A52s long-term review

If you're put off by the fingerprint unlock experience for this or any other reason, there's also face unlock waiting to be used. It's the usual camera-based fare that's less secure than a fingerprint but can be a little faster.

Samsung Galaxy A52s screen review

The display is probably one of the best features of the Galaxy A52s, which is understandable given that the AMOLED panel is produced by sister company Samsung Display. It helps the Galaxy A52s stand out among its similarly priced peers because it's a quality panel. Of course, it won't match a Samsung flagship in resolution, brightness or quality, but it's not hopelessly far off either, which is definitely commendable.


The resolution is basically par for the course for a mid-range smartphone, and arguably good enough even for flagships, unless you're the type of person who goes hunting for individual pixels. We never felt the need for more, and if you do, you'll unfortunately have to pay a premium for a premium device.

The Galaxy A52s' display also has a 120Hz refresh rate, which is great but expected even in today's mid-range. Only OnePlus still thinks it can get away with 90Hz panels in this class; every other company has gone to 120Hz, and that's a huge plus for the user experience. The implementation is a little rough around the edges compared to what we've seen on Samsung flagships with 120Hz panels, though. On the Galaxy A52s, there's no dynamic switching of refresh rates: you choose between 60Hz and 120Hz, and you always get what you chose.

Motion smoothness (refresh rate) setting - Samsung Galaxy A52s long-term review

In theory, this could hurt battery life, although in practice, as you'll see in the dedicated section of this review, it's still pretty good. We simply used the phone set to 120Hz, because we don't see any point in getting a screen with a high refresh rate and then not using it.

There are still some apps that insist on running at 60Hz no matter what (Google Maps and Camera come to mind), but those are just the exceptions to the above rule. Overall, this display is very smooth and performs very well despite the fact that it is not LTPO and therefore cannot dynamically adjust the refresh rate.

In terms of brightness, it doesn't reach the heights we've seen in flagship smartphones, which is understandable given the price. But the screen is still legible in direct sunlight, even if you occasionally have to squint a bit. It's still one of the brightest panels at this price point, and that says it all.


We also really enjoyed the automatic brightness curve and found that we rarely had to make manual adjustments. Most of the time, the algorithm got it right, and it actually outperformed the much more expensive Galaxy Z Flip3 and Z Fold3 that we recently reviewed long-term. This might actually be the best automatic brightness algorithm on any mid-range smartphone, at least among the ones we've tried. You won't have a problem at night either, as the screen can dim enough not to hurt your eyes in dark environments.

Display settings - Samsung Galaxy A52s long-term review

As usual with Samsung phones, you can choose between two color profiles. Natural aims for the best sRGB accuracy, and while it doesn't quite cover that color space, it's pretty close. The default profile is Vivid, which targets the P3 color space, where it does a slightly worse job than Natural does for sRGB.

Screen mode settings - Samsung Galaxy A52s long-term review

Vivid is a customizable profile that allows you to adjust the color temperature with a slider. You also get a custom white point setting, so if you want a very specific look for your phone's screen, you're sure to find a combination that suits you.

Eye comfort shield

Like every recent smartphone, the Galaxy A52s features a blue light filter, which Samsung calls Eye Comfort Shield. It's not as customizable as Xiaomi's equivalent, but it gets the job done with a color temperature slider and not much else. It's also schedulable, and you can even select an automatic mode where the intensity of the effect changes based on the time of day. That's the minimum we look for in such a feature, and it's covered.

One oddity is that the filter turns off for the always-on display and lock screen, presumably because it interferes with fingerprint unlocking – the optical sensor basically shines light on your finger to read it. That's fine, but the problem is that once you unlock the phone, the filter often takes a while to kick back in – between half a second and two seconds – and the further you've pushed that color temperature slider to the right, the more intense the effect and the more noticeable the delay becomes.

Eye comfort shield settings - Samsung Galaxy A52s long-term review

Speaking of the always-on display – since this is an AMOLED panel, it's here of course, and it's feature-rich, unlike the barebones implementations on some mid-range rivals. It's also highly customizable, although it lacks some of the more advanced features found in the likes of MIUI.

Always On Display settings - Samsung Galaxy A52s long-term review

That said, it probably works well enough for most people, letting you choose from a variety of analog and digital clocks, images from the gallery, stickers, Bitmoji, and more. You can choose its orientation and schedule it too, and control when it shows – anywhere from always-on to only when you tap the screen or receive a new notification. Overall, this feature, which has become a must-have for many people, is very nicely done.

Display-related niggles

As we mentioned earlier, while the display itself is one of the best parts of the Galaxy A52s, it also houses one of the strangest: the selfie camera embedded in a hole in the center. We don't mean the camera itself, but the pointless silver ring around it, which catches light at various angles and reflects it back at you.

We thought the whole point of hole-punch selfie cameras was to make them as unobtrusive as possible, but this ring negates a lot of that benefit for no apparent reason. You can easily get used to it, of course, we just can’t understand who thought it would be a good idea to add it and why. Samsung isn’t alone in this, though – we’ve seen a lot of similar implementations from some of its rivals over the past few years, and we’ve been just as confused when dealing with them.


Since we're complaining, let's also mention the nonexistent or very poor anti-fingerprint coating on the screen. Our unit is a retail Galaxy A52s, so this is exactly the experience any buyer would get: the screen gets covered in fingerprint smudges within minutes of use. If you don't like the greasy look of fingerprints on your phone's screen, you'll want to keep a microfiber cloth at hand, that's for sure.

Of course, this problem becomes entirely irrelevant if you use a screen protector, as everything then depends on that protector's anti-fingerprint coating to shield you from the aforementioned mess. Most good screen protectors actually have a better coating than the Galaxy A52s' screen itself – that is, if the screen even has one; we can't tell. It's just one of those things that detracts from the feel of using the phone, and we're left wondering how much it would really have cost Samsung to use the same kind of coating as on its flagships.

Performance review of Samsung Galaxy A52s

The Galaxy A52s is a special case where it's hard to talk about performance and smoothness separately, regardless of how subjective the latter might be. If you're interested in raw performance numbers from benchmarks, you should definitely take a look at our regular review, which has plenty of them. In long-term reviews, we avoid cold numbers and instead try to describe how the phone feels in real day-to-day use.

Putting smoothness aside for just a second, the Galaxy A52s performs perfectly well for the price. If anything, the chipset punches a bit above the phone's current price: the Snapdragon 778G is miles ahead of the 765G and 750G of yesteryear, and raw numbers alone don't paint an accurate picture of how significant the jump from those older SoCs is. The 778G is practically an 'almost flagship' chipset, if by 'flagship' you mean something like the Snapdragon 870.

However, the Galaxy A52s feels slower than other devices powered by the same chipset. We assume this is down to insufficient software optimization, but we can't know for sure. It's certainly not slow for a mid-ranger, but it's only about as fast as a Snapdragon 720G/730G/732G-equipped Redmi, and that's not good when you consider that on paper the 778G should be much more powerful than those.


Everything works on this phone, just a little slower than we expected. Speed-wise it's very similar to the Redmi Note 10 Pro, when it really should be head and shoulders above it. But it isn't. We've recently praised Samsung for fixing its smoothness issues on high-end devices like the Galaxy Z Flip3 and Fold3, which are now almost indistinguishable from their top Chinese rivals, but it's clear the Korean company hasn't applied the same attention to detail to software optimization for the Galaxy A52s.

And that’s a real shame because with the Snapdragon 778G chipset and a lot more tweaks, it had the potential to be one of the best buys in the entire industry without too many caveats. As it is, the main caveat with this phone is the fact that it’s slower than it should be considering the hardware, and it’s anything but smooth.

There's lag everywhere, reminding us of the (not so) glory days of TouchWiz and Samsung Experience, before One UI was One UI. Unlock the phone and try to navigate around immediately, and transitions lag, animations lag, the app drawer lags. Things settle a few seconds after unlocking and the lag then goes away, but that's not the behavior we'd expect with a near-flagship chip inside. The Google Discover feed, which you can have on the left side of your home screen, is a mess, and Samsung Free, the Korean company's alternative, is even worse.


There's also a glitch in swipe interpretation, which means that sometimes, when you're trying to scroll horizontally through your recent apps, you end up closing one because the phone reads your swipe as vertical. Pressing the power button to show the lock screen lags about 70% of the time, and occasionally it's even buggy in that it takes you straight to the home screen without fingerprint or face authentication. This only happened to us twice over as many weeks with the Galaxy A52s, but it seems like enough of a security issue to merit attention.

The Snapdragon 778G chipset was, on paper, the Galaxy A52s' biggest upgrade over its predecessor. In actual use, however, while there are improvements in speed and smoothness, they're nowhere near what we'd expect based on the hardware. We have a feeling that most of the issues described here come down to the software, meaning they could theoretically be fixed via an update – but the phone has been out for many months now and has already received plenty of updates.

Don't get me wrong, this is probably the best-performing device in the A series (probably tied with the A73), but it's merely passable. Performance for the money has always been an area where the A series fell behind the competition, and while the A52s makes up some of the difference, it's still nothing amazing. It could be better.

Checking battery life and charging speed of Samsung Galaxy A52s

Battery life has been good, if not record-breaking, during our time with the Galaxy A52s. With our usage, detailed below, we never had to worry about not making it through the day on a single charge. It was a one-day smartphone for us, with some reserve at the end of the day, but not enough to see us through even half of the next one. While some people want multi-day battery life from a phone, we feel a day is enough, as you can always charge overnight.

And overnight charging is what you're likely to do with the Galaxy A52s, as charging is very slow, which makes a midday top-up a frustrating experience. So you'd better hope a day of heavy use on mobile data with a weak signal doesn't leave you needing one. Samsung is lagging behind in the fast charging game, and by a lot. The Galaxy A52s' 4,500mAh battery takes around an hour and a half to fully charge, which is 50-100% longer than most of its competitors need. That's with Samsung's 25W charger, which thankfully comes in the box this time – the Galaxy A52 was also capable of 25W charging, but only shipped with a 15W charger.

The fact of the matter is that Samsung really needs to get its charging act together, whether for flagships or mid-rangers like this. Wireless charging isn't offered here, but that's par for the course at this price point, so we don't think it's a huge omission.

Battery life samples - Samsung Galaxy A52s long-term review

Now back to battery life: the screenshots above are snapshots of our experience across various days, with 12-16 hours off the charger, mostly on Wi-Fi with about an hour or so on 5G, and Bluetooth always on and connected to TWS earbuds for around two hours of calls and music or podcast listening. Location was always on too, with about half an hour of GPS navigation in Waze or Google Maps daily. With similar usage, these screenshots show what you can expect, but keep in mind that any variation in usage will result in different numbers.

Checking the software of the Samsung Galaxy A52s phone

The Galaxy A52s currently runs One UI 4.1, the latest version of Samsung's skin, on top of Android 12. So even though it's a mid-ranger, it has the same software as the Korean company's high-end devices, which is a big advantage. But while you get all the features One UI 4.1 has to offer, you don't get the smooth performance of a Galaxy S or Fold or Flip.

Samsung Galaxy A52s long-term review

As we detailed in the performance section of this review, this is where you'll definitely feel that this is far from a premium device. That's not a situation we're happy with, but it is what it is. At least you get the full One UI 4.1 interface to work with, and not the Core version that lower-end A models have to make do with.

Updates

During our recent long-term reviews of Samsung devices, we've consistently praised the company for making big strides in the software update game, and we'll reiterate that it's come a long way in recent years. From being one of the slowest updaters, it's now among the fastest, and that goes even for mid-range phones like the Galaxy A52s, not just flagships.

It was updated to Android 12 with One UI 4.0 in January, and a few months later, in March, it received the One UI 4.1 update – remember that One UI 4.1 only came with the S22 family in February. For any other Android device manufacturer (with the exception of Google itself), this turnaround time would be unprecedented.

For Samsung, it’s just par for the course these days, and that’s refreshing to see, as is the company’s promise of three years of major Android updates and four years of security patches for the A52s. This is more than the flagships of many competitors! The current version of One UI 4.1 on our Galaxy A52s review unit has a security patch level of June 1, 2022, and thus is by no means outdated (take note, Xiaomi!).

Current software - Samsung Galaxy A52s long-term review

However, all is not rosy – the initial Android 12 update with One UI 4.0 introduced a lot of bugs. We haven't seen many of them in the current build – the only ones we still have to deal with are listed in the performance section. But it goes to show once again that while mid-rangers may receive updates almost as quickly as flagships, they don't receive the same care and quality control. At least not yet – hopefully this will change in the future.

Features

One UI has evolved a lot since its inception, but it still looks and feels like a typical Samsung affair, with countless options and settings for anything you can possibly imagine (and plenty of things we bet you've never thought of). It's clear that the 'more is more' design philosophy reigns in Korea, and if you enjoy long trips through the settings, the Galaxy A52s and its software will fully cater to you. You can easily spend hours there, going through every nook and cranny.

Settings - Samsung Galaxy A52s long-term review


Lock screen notification settings - Samsung Galaxy A52s long-term review

Here's a long-standing gripe of ours with One UI, and it still remains unchanged. On other Android skins, if you long-press the power button, you get a power menu with options to reboot or shut down your device. Not on a Samsung, though – by default, you get Bixby. Once again, this can easily be changed back to what it should have been – but it's only easy if you know it's possible. It doesn't help that the settings menu calls the power button the 'side key' for some reason.

Side key settings - Samsung Galaxy A52s long-term review

Bixby's continued existence, despite its apparent inferiority to the also-preinstalled Google Assistant, ties into another common theme with Samsung phones – duplication. You get a lot of Samsung-made apps that feel like they were created just to mirror Google's existing apps (which, in most but not all cases, are far superior). And the epitome of the whole thing is the fact that you get two app stores on the Galaxy A52s, because Samsung can't afford not to have its own.

If you've used another Samsung in recent years, you're no doubt used to these shenanigans, which make the situation confusing for newcomers to the brand without really benefiting the end user either way. But if you could ask Samsung, it would probably say something about how it's nice to have more options.

And this brings us back to the 'more is more' philosophy. Love it or hate it, it's here to stay – and in some ways we don't mind, because it's the polar opposite of what Apple likes to do, and having such opposites on the market is a boon for variety and choice. That's something we're all for.

Dark mode

One UI 4.1 comes with all the features you've come to expect from an Android skin in 2022. It has a dark mode that does the job well enough without overdoing the settings. You can turn it on from dusk to sunrise or with a custom time range, and that's it. There's no control over how dark it goes, and no forcing a dark mode on apps that don't have one of their own. Both of these options are available in other Android skins, but not here.

Dark mode and gesture navigation settings - Samsung Galaxy A52s long-term review Dark mode and gesture navigation settings - Samsung Galaxy A52s long-term review Dark mode and gesture navigation settings - Samsung Galaxy A52s long-term review

Gesture navigation is also supported on this phone, and it works well. That said, at the default gesture sensitivity setting, we found that our swipes up to go home were interpreted as scrolling most of the time, leading to a lot of frustration. Moving the sensitivity slider to either its lowest or highest position fixed this problem, although we can’t say we understand why both extremes would achieve the same effect. Regardless, if you’re bothered by your swipe up to go home being misinterpreted as vertical scrolling, play with that slider – it should fix your problem.

Launcher, wallpapers

The launcher is fairly basic in terms of customization, and it’s strange when it comes to the app drawer, which scrolls horizontally, like the home screens, and not vertically, like most other app drawers. Since you swipe up to reveal the app drawer, we think it would make more sense for it to scroll vertically, but that might just be a preference. Similarly odd is the fact that the drawer is, by default, sorted seemingly at random rather than alphabetically.

Sure, there’s a setting you can change to get back to normal Android behavior, because of course there is. But even then, folders (yes, the app drawer has folders, for some reason) don’t obey the sorting and are always shown first – perhaps Samsung’s engineers think of the app drawer as a file manager on a computer. We can’t know for sure, but what else are we to assume given this behavior?

App drawer with folders and sorting options - Samsung Galaxy A52s long-term review App drawer with folders and sorting options - Samsung Galaxy A52s long-term review

Next, if you enjoy scrolling through feeds, you can add Google Discover or Samsung Free to the leftmost pane of your home screen. Samsung Free is by far the laggiest part of the UI, so maybe it’s good to have the option just for that experience? We didn’t find any other use for it anyway.

Home screen, Google Discover, Home screen settings - Samsung Galaxy A52s long-term review Home screen, Google Discover, Home screen settings - Samsung Galaxy A52s long-term review Home screen, Google Discover, Home screen settings - Samsung Galaxy A52s long-term review Home screen, Google Discover, Home screen settings - Samsung Galaxy A52s long-term review Home screen, Google Discover, Home screen settings - Samsung Galaxy A52s long-term review Home screen, Google Discover, Home screen settings - Samsung Galaxy A52s long-term review

The recent apps view is thankfully what you’d expect from an Android skin, with a horizontally scrolling list of app cards that you can swipe through. Below these, by default, are four icons for suggested apps the phone thinks you want to access next. The suggestion algorithm seemed better in previous iterations of One UI, but we still found it decent on the Galaxy A52s. It managed to guess the app we were after about 75% of the time, which isn’t outstanding, but it’s not bad either. And when it gets it right, it saves a few seconds of scrolling.

Recents and Recents settings - Samsung Galaxy A52s long-term review Recents and Recents settings - Samsung Galaxy A52s long-term review

Since this is Android 12, the UI color scheme can be pulled from your chosen wallpaper, and you can choose from a few palette options – Samsung couldn’t help being Samsung here and offers additional color palettes, whereas Google simply applies one automatically when you change your wallpaper. So the process requires an extra tap here, since you also have to pick the palette you want. Although it’s more complicated than it needs to be, it’s still a nice, quick and easy way to customize the UI colors and essentially get a fresh new theme every time you set a new wallpaper.

Wallpaper settings - Samsung Galaxy A52s long-term review Wallpaper settings - Samsung Galaxy A52s long-term review Wallpaper settings - Samsung Galaxy A52s long-term review Wallpaper settings - Samsung Galaxy A52s long-term review

Speaking of wallpapers, the Galaxy Store has a wide selection of them, both free and paid, which is good because the bundled ones aren’t much to write home about – and there aren’t many of them either. There’s an option to get a different lock screen wallpaper (from multiple categories) every time you unlock the phone, which is great, but we still can’t understand why there isn’t a similar system for the home screen wallpaper. When you want to change that, you still have to apply it manually.

Other features

One UI has a few built-in “ecosystem” features, like Continue apps on other devices, which syncs data across Samsung products but only works with a small subset of apps. Call & text on other devices is also available, which can be very useful if you have multiple Samsung phones. There’s also Link to Windows, which is handy if you have a Windows PC and don’t want to keep picking up your phone while you’re working.

Other features - Samsung Galaxy A52s long-term review Other features - Samsung Galaxy A52s long-term review Other features - Samsung Galaxy A52s long-term review Other features - Samsung Galaxy A52s long-term review

A special Labs area in Settings allows you to force multi-window support for all apps, as many still don’t support it. You can even hide the status bar in split-screen view (and the navigation bar, if you’re using that instead of gestures) to gain some display real estate – but if you do, you’ll have to swipe down once to reveal the status bar and then swipe again to pull down the notification panel.

Samsung Galaxy A52s camera review

The rear camera setup of the Galaxy A52s stands out against some of its competitors thanks to the presence of optical image stabilization (OIS) on the main camera. Its ultra-wide shooter also has a higher resolution than is usual at this price, so we were very curious to put these to the test.

The main camera produces good images in daylight – decent for the price, but not really outstanding in any way. Dynamic range is fairly wide, and the pervasive ‘Samsung look’ is present throughout, with high contrast and sharpness, as well as colors that sometimes pop a bit too much.

Samsung Galaxy A52s long-term review

People seem to love this look for sharing on social media, so we understand why it’s there, but it still feels a little too much for our eyes in some photos. While these images aren’t bad by any means for the price, we have to admit that given Samsung’s extensive experience in making phones with great cameras, we were expecting a little more from the Galaxy A52s. Alas, the truly great cameras are still reserved for higher price points.

Daytime samples from the main camera - f/1.8, ISO 25, 1/1439s - Samsung Galaxy A52s long-term review Daytime samples from the main camera - f/1.8, ISO 25, 1/1552s - Samsung Galaxy A52s long-term review Daytime samples from the main camera - f/1.8, ISO 25, 1/785s - Samsung Galaxy A52s long-term review
Daytime samples from the main camera - f/1.8, ISO 25, 1/292s - Samsung Galaxy A52s long-term review Daytime samples from the main camera - f/1.8, ISO 25, 1/195s - Samsung Galaxy A52s long-term review Daytime samples from the main camera - f/1.8, ISO 25, 1/584s - Samsung Galaxy A52s long-term review
Daytime samples from the main camera - f/1.8, ISO 25, 1/430s - Samsung Galaxy A52s long-term review Daytime samples from the main camera - f/1.8, ISO 25, 1/303s - Samsung Galaxy A52s long-term review Daytime samples from the main camera - f/1.8, ISO 25, 1/789s - Samsung Galaxy A52s long-term review
Daytime samples from the main camera - f/1.8, ISO 50, 1/100s - Samsung Galaxy A52s long-term review Daytime samples from the main camera - f/1.8, ISO 25, 1/1074s - Samsung Galaxy A52s long-term review Daytime samples from the main camera - f/1.8, ISO 25, 1/923s - Samsung Galaxy A52s long-term review
Daytime samples from the main camera - f/1.8, ISO 25, 1/1279s - Samsung Galaxy A52s long-term review Daytime samples from the main camera - f/1.8, ISO 25, 1/385s - Samsung Galaxy A52s long-term review Daytime samples from the main camera - f/1.8, ISO 25, 1/1770s - Samsung Galaxy A52s long-term review
Daytime samples from the main camera - f/1.8, ISO 25, 1/250s - Samsung Galaxy A52s long-term review Daytime samples from the main camera - f/1.8, ISO 25, 1/257s - Samsung Galaxy A52s long-term review Daytime samples from the main camera - f/1.8, ISO 25, 1/130s - Samsung Galaxy A52s long-term review

The ultra-wide camera also produces good images, and we appreciate the extra clarity compared to the 8MP ultra-wides most competitors offer. Softness is visible around the edges, as you’d expect from a non-flagship ultra-wide, and the color science seems quite different from the main camera’s. It’s not really noticeable unless you look for it, but you’ll see it when you do. This snapper certainly won’t win any awards, but it’s good and reliable at its job, and probably better than any 8MP ultra-wide on a competing device.

Daytime samples from the ultrawide - f/2.2, ISO 50, 1/2027s - Samsung Galaxy A52s long-term review Daytime samples from the ultrawide - f/2.2, ISO 50, 1/1812s - Samsung Galaxy A52s long-term review Daytime samples from the ultrawide - f/2.2, ISO 50, 1/347s - Samsung Galaxy A52s long-term review
Daytime samples from the ultrawide - f/2.2, ISO 50, 1/216s - Samsung Galaxy A52s long-term review Daytime samples from the ultrawide - f/2.2, ISO 50, 1/571s - Samsung Galaxy A52s long-term review Daytime samples from the ultrawide - f/2.2, ISO 50, 1/376s - Samsung Galaxy A52s long-term review
Daytime samples from the ultrawide - f/2.2, ISO 50, 1/1024s - Samsung Galaxy A52s long-term review Daytime samples from the ultrawide - f/2.2, ISO 50, 1/337s - Samsung Galaxy A52s long-term review Daytime samples from the ultrawide - f/2.2, ISO 50, 1/1158s - Samsung Galaxy A52s long-term review
Daytime samples from the ultrawide - f/2.2, ISO 50, 1/610s - Samsung Galaxy A52s long-term review Daytime samples from the ultrawide - f/2.2, ISO 50, 1/3604s - Samsung Galaxy A52s long-term review Daytime samples from the ultrawide - f/2.2, ISO 50, 1/323s - Samsung Galaxy A52s long-term review

While the Galaxy A52s lacks a proper telephoto camera (like almost all phones at this price point), there’s unsurprisingly a 2x zoom toggle in the viewfinder. Using it produces crops from the main camera, and the results can sometimes be very soft. They’re usable at small sizes, but we probably wouldn’t recommend shooting at 2x too often, as the quality delta compared to 1x shots is quite noticeable. On the other hand, you still get a decent amount of detail and relatively low noise.

Daytime zoom samples - f/1.8, ISO 25, 1/1531s - Samsung Galaxy A52s long-term review Daytime zoom samples - f/1.8, ISO 25, 1/1454s - Samsung Galaxy A52s long-term review Daytime zoom samples - f/1.8, ISO 25, 1/1344s - Samsung Galaxy A52s long-term review
Daytime zoom samples - f/1.8, ISO 25, 1/566s - Samsung Galaxy A52s long-term review Daytime zoom samples - f/1.8, ISO 25, 1/649s - Samsung Galaxy A52s long-term review Daytime zoom samples - f/1.8, ISO 25, 1/190s - Samsung Galaxy A52s long-term review
Daytime zoom samples - f/1.8, ISO 25, 1/714s - Samsung Galaxy A52s long-term review Daytime zoom samples - f/1.8, ISO 25, 1/280s - Samsung Galaxy A52s long-term review Daytime zoom samples - f/1.8, ISO 25, 1/789s - Samsung Galaxy A52s long-term review
Daytime zoom samples - f/1.8, ISO 32, 1/50s - Samsung Galaxy A52s long-term review Daytime zoom samples - f/1.8, ISO 25, 1/950s - Samsung Galaxy A52s long-term review Daytime zoom samples - f/1.8, ISO 25, 1/1260s - Samsung Galaxy A52s long-term review
Daytime zoom samples - f/1.8, ISO 25, 1/113s - Samsung Galaxy A52s long-term review Daytime zoom samples - f/1.8, ISO 25, 1/1517s - Samsung Galaxy A52s long-term review Daytime zoom samples - f/1.8, ISO 25, 1/1025s - Samsung Galaxy A52s long-term review
Daytime zoom samples - f/1.8, ISO 25, 1/526s - Samsung Galaxy A52s long-term review Daytime zoom samples - f/1.8, ISO 25, 1/798s - Samsung Galaxy A52s long-term review Daytime zoom samples - f/1.8, ISO 25, 1/279s - Samsung Galaxy A52s long-term review

At night, the main camera produces decent photos with good detail and relatively low noise. In many scenes, however, the heavy-handed processing can become rather obvious.

Nighttime samples from the main camera - f/1.8, ISO 1600, 1/25s - Samsung Galaxy A52s long-term review Nighttime samples from the main camera - f/1.8, ISO 500, 1/50s - Samsung Galaxy A52s long-term review Nighttime samples from the main camera - f/1.8, ISO 800, 1/25s - Samsung Galaxy A52s long-term review
Nighttime samples from the main camera - f/1.8, ISO 1250, 1/25s - Samsung Galaxy A52s long-term review Nighttime samples from the main camera - f/1.8, ISO 2000, 1/20s - Samsung Galaxy A52s long-term review Nighttime samples from the main camera - f/1.8, ISO 1250, 1/25s - Samsung Galaxy A52s long-term review
Nighttime samples from the main camera - f/1.8, ISO 1600, 1/25s - Samsung Galaxy A52s long-term review Nighttime samples from the main camera - f/1.8, ISO 500, 1/50s - Samsung Galaxy A52s long-term review Nighttime samples from the main camera - f/1.8, ISO 800, 1/25s - Samsung Galaxy A52s long-term review
Nighttime samples from the main camera - f/1.8, ISO 2000, 1/20s - Samsung Galaxy A52s long-term review Nighttime samples from the main camera - f/1.8, ISO 2500, 1/13s - Samsung Galaxy A52s long-term review Nighttime samples from the main camera - f/1.8, ISO 1000, 1/25s - Samsung Galaxy A52s long-term review
Nighttime samples from the main camera - f/1.8, ISO 2500, 1/13s - Samsung Galaxy A52s long-term review Nighttime samples from the main camera - f/1.8, ISO 2000, 1/17s - Samsung Galaxy A52s long-term review Nighttime samples from the main camera - f/1.8, ISO 2500, 1/15s - Samsung Galaxy A52s long-term review
Nighttime samples from the main camera - f/1.8, ISO 1600, 1/24s - Samsung Galaxy A52s long-term review Nighttime samples from the main camera - f/1.8, ISO 1250, 1/24s - Samsung Galaxy A52s long-term review Nighttime samples from the main camera - f/1.8, ISO 1250, 1/25s - Samsung Galaxy A52s long-term review

Using Night Mode helps restore highlights and results in even more detailed photos, but the sharpening is even more aggressive than in auto mode, and shadows can get crushed.

Night Mode samples from the main camera - f/1.8, ISO 800, 1/14s - Samsung Galaxy A52s long-term review Night Mode samples from the main camera - f/1.8, ISO 320, 1/33s - Samsung Galaxy A52s long-term review Night Mode samples from the main camera - f/1.8, ISO 500, 1/17s - Samsung Galaxy A52s long-term review
Night Mode samples from the main camera - f/1.8, ISO 1000, 1/14s - Samsung Galaxy A52s long-term review Night Mode samples from the main camera - f/1.8, ISO 640, 1/14s - Samsung Galaxy A52s long-term review Night Mode samples from the main camera - f/1.8, ISO 1000, 1/9s - Samsung Galaxy A52s long-term review
Night Mode samples from the main camera - f/1.8, ISO 500, 1/14s - Samsung Galaxy A52s long-term review Night Mode samples from the main camera - f/1.8, ISO 1000, 1/11s - Samsung Galaxy A52s long-term review Night Mode samples from the main camera - f/1.8, ISO 1250, 1/8s - Samsung Galaxy A52s long-term review
Night Mode samples from the main camera - f/1.8, ISO 1250, 1/8s - Samsung Galaxy A52s long-term review Night Mode samples from the main camera - f/1.8, ISO 1000, 1/11s - Samsung Galaxy A52s long-term review Night Mode samples from the main camera - f/1.8, ISO 1000, 1/8s - Samsung Galaxy A52s long-term review
Night Mode samples from the main camera - f/1.8, ISO 1000, 1/13s - Samsung Galaxy A52s long-term review Night Mode samples from the main camera - f/1.8, ISO 1000, 1/15s - Samsung Galaxy A52s long-term review Night Mode samples from the main camera - f/1.8, ISO 640, 1/14s - Samsung Galaxy A52s long-term review

The ultra-wide camera struggles in low light, but much less so than the 8MP sensors typically found in smartphones at this price point. The photos it produces are noticeably softer than those from the main sensor and have less detail, but they still end up usable most of the time – at least if there are light sources around. Dynamic range is also better than we expected, though it won’t win any awards.

Nighttime samples from the ultrawide - f/2.2, ISO 2000, 1/10s - Samsung Galaxy A52s long-term review Nighttime samples from the ultrawide - f/2.2, ISO 400, 1/25s - Samsung Galaxy A52s long-term review Nighttime samples from the ultrawide - f/2.2, ISO 800, 1/13s - Samsung Galaxy A52s long-term review
Nighttime samples from the ultrawide - f/2.2, ISO 800, 1/10s - Samsung Galaxy A52s long-term review Nighttime samples from the ultrawide - f/2.2, ISO 1600, 1/10s - Samsung Galaxy A52s long-term review Nighttime samples from the ultrawide - f/2.2, ISO 1250, 1/10s - Samsung Galaxy A52s long-term review
Nighttime samples from the ultrawide - f/2.2, ISO 400, 1/25s - Samsung Galaxy A52s long-term review Nighttime samples from the ultrawide - f/2.2, ISO 640, 1/17s - Samsung Galaxy A52s long-term review Nighttime samples from the ultrawide - f/2.2, ISO 800, 1/10s - Samsung Galaxy A52s long-term review
Nighttime samples from the ultrawide - f/2.2, ISO 1600, 1/10s - Samsung Galaxy A52s long-term review Nighttime samples from the ultrawide - f/2.2, ISO 2500, 1/10s - Samsung Galaxy A52s long-term review Nighttime samples from the ultrawide - f/2.2, ISO 800, 1/14s - Samsung Galaxy A52s long-term review
Nighttime samples from the ultrawide - f/2.2, ISO 2000, 1/10s - Samsung Galaxy A52s long-term review Nighttime samples from the ultrawide - f/2.2, ISO 1000, 1/10s - Samsung Galaxy A52s long-term review Nighttime samples from the ultrawide - f/2.2, ISO 1250, 1/10s - Samsung Galaxy A52s long-term review

Using Night Mode generally lifts shadows and restores highlights, at the expense of over-sharpening in some scenes, which can result in artifacts that may render the image unusable. That makes it hard to recommend one mode over the other for night photography, since both auto and Night Mode have their pros and cons.

Night Mode samples from the ultrawide - f/2.2, ISO 800, 1/6s - Samsung Galaxy A52s long-term review Night Mode samples from the ultrawide - f/2.2, ISO 400, 1/25s - Samsung Galaxy A52s long-term review Night Mode samples from the ultrawide - f/2.2, ISO 640, 1/17s - Samsung Galaxy A52s long-term review
Night Mode samples from the ultrawide - f/2.2, ISO 640, 1/6s - Samsung Galaxy A52s long-term review Night Mode samples from the ultrawide - f/2.2, ISO 640, 1/11s - Samsung Galaxy A52s long-term review Night Mode samples from the ultrawide - f/2.2, ISO 800, 1/5s - Samsung Galaxy A52s long-term review
Night Mode samples from the ultrawide - f/2.2, ISO 640, 1/5s - Samsung Galaxy A52s long-term review Night Mode samples from the ultrawide - f/2.2, ISO 800, 1/25s - Samsung Galaxy A52s long-term review Night Mode samples from the ultrawide - f/2.2, ISO 640, 1/5s - Samsung Galaxy A52s long-term review
Night Mode samples from the ultrawide - f/2.2, ISO 1250, 1/4s - Samsung Galaxy A52s long-term review Night Mode samples from the ultrawide - f/2.2, ISO 1250, 1/4s - Samsung Galaxy A52s long-term review Night Mode samples from the ultrawide - f/2.2, ISO 800, 1/4s - Samsung Galaxy A52s long-term review
Night Mode samples from the ultrawide - f/2.2, ISO 1000, 1/4s - Samsung Galaxy A52s long-term review Night Mode samples from the ultrawide - f/2.2, ISO 640, 1/11s - Samsung Galaxy A52s long-term review Night Mode samples from the ultrawide - f/2.2, ISO 800, 1/9s - Samsung Galaxy A52s long-term review

2x shots at night have a lot more noise than 1x shots and are often accompanied by sharpening artifacts, but are otherwise not too bad. They’re good enough for quick social media sharing if needed.

Nighttime zoom samples - f/1.8, ISO 800, 1/25s - Samsung Galaxy A52s long-term review Nighttime zoom samples - f/1.8, ISO 400, 1/33s - Samsung Galaxy A52s long-term review Nighttime zoom samples - f/1.8, ISO 640, 1/25s - Samsung Galaxy A52s long-term review
Nighttime zoom samples - f/1.8, ISO 800, 1/14s - Samsung Galaxy A52s long-term review Nighttime zoom samples - f/1.8, ISO 2000, 1/17s - Samsung Galaxy A52s long-term review Nighttime zoom samples - f/1.8, ISO 1250, 1/25s - Samsung Galaxy A52s long-term review
Nighttime zoom samples - f/1.8, ISO 1000, 1/25s - Samsung Galaxy A52s long-term review Nighttime zoom samples - f/1.8, ISO 500, 1/50s - Samsung Galaxy A52s long-term review Nighttime zoom samples - f/1.8, ISO 500, 1/25s - Samsung Galaxy A52s long-term review
Nighttime zoom samples - f/1.8, ISO 1600, 1/25s - Samsung Galaxy A52s long-term review Nighttime zoom samples - f/1.8, ISO 2500, 1/14s - Samsung Galaxy A52s long-term review Nighttime zoom samples - f/1.8, ISO 500, 1/25s - Samsung Galaxy A52s long-term review
Nighttime zoom samples - f/1.8, ISO 2000, 1/25s - Samsung Galaxy A52s long-term review Nighttime zoom samples - f/1.8, ISO 2000, 1/20s - Samsung Galaxy A52s long-term review Nighttime zoom samples - f/1.8, ISO 2000, 1/17s - Samsung Galaxy A52s long-term review
Nighttime zoom samples - f/1.8, ISO 1250, 1/24s - Samsung Galaxy A52s long-term review Nighttime zoom samples - f/1.8, ISO 2000, 1/24s - Samsung Galaxy A52s long-term review Nighttime zoom samples - f/1.8, ISO 640, 1/25s - Samsung Galaxy A52s long-term review

Using Night Mode for these usually results in a watercolor-like effect, as all the aggressive processing proves too much for the cropped image to handle. You’ll still get a (barely) usable photo, but if you don’t like the watercolor look, it’s best to avoid this mode when zooming.

Night Mode zoom samples - f/1.8, ISO 500, 1/33s - Samsung Galaxy A52s long-term review Night Mode zoom samples - f/1.8, ISO 400, 1/33s - Samsung Galaxy A52s long-term review Night Mode zoom samples - f/1.8, ISO 400, 1/20s - Samsung Galaxy A52s long-term review
Night Mode zoom samples - f/1.8, ISO 800, 1/14s - Samsung Galaxy A52s long-term review Night Mode zoom samples - f/1.8, ISO 640, 1/14s - Samsung Galaxy A52s long-term review Night Mode zoom samples - f/1.8, ISO 400, 1/33s - Samsung Galaxy A52s long-term review
Night Mode zoom samples - f/1.8, ISO 500, 1/25s - Samsung Galaxy A52s long-term review Night Mode zoom samples - f/1.8, ISO 1000, 1/11s - Samsung Galaxy A52s long-term review Night Mode zoom samples - f/1.8, ISO 1000, 1/8s - Samsung Galaxy A52s long-term review
Night Mode zoom samples - f/1.8, ISO 1000, 1/13s - Samsung Galaxy A52s long-term review Night Mode zoom samples - f/1.8, ISO 1000, 1/11s - Samsung Galaxy A52s long-term review Night Mode zoom samples - f/1.8, ISO 1000, 1/9s - Samsung Galaxy A52s long-term review
Night Mode zoom samples - f/1.8, ISO 640, 1/15s - Samsung Galaxy A52s long-term review Night Mode zoom samples - f/1.8, ISO 1000, 1/11s - Samsung Galaxy A52s long-term review Night Mode zoom samples - f/1.8, ISO 500, 1/17s - Samsung Galaxy A52s long-term review

Selfies can be taken in two field-of-view modes, narrow and wide, with the former being the default, though you can change that. And you probably will, because the narrow mode is a little too close for comfort, and if you’re trying to fit more than one person into a shot, you can forget about it. That’s where the wide mode comes in handy.

The pictures are good, with a lot of detail and not a lot of noise. Skin tones aren’t always accurate, but otherwise these shots are fine during the day. At night all of that changes, and even with the screen-fill flash, you’ll get dark and noisy photos unless you’re near a light source. If you find one, you’ll obviously get worse quality than during the day, but you might still be able to grab a few usable selfies to send to your friends.

Selfies day and night, normal/wide - f/2.2, ISO 64, 1/1375s - Samsung Galaxy A52s long-term review Selfies day and night, normal/wide - f/2.2, ISO 64, 1/1233s - Samsung Galaxy A52s long-term review Selfies day and night, normal/wide - f/2.2, ISO 64, 1/1246s - Samsung Galaxy A52s long-term review
Selfies day and night, normal/wide - f/2.2, ISO 64, 1/1203s - Samsung Galaxy A52s long-term review Selfies day and night, normal/wide - f/2.2, ISO 4000, 1/33s - Samsung Galaxy A52s long-term review Selfies day and night, normal/wide - f/2.2, ISO 4000, 1/33s - Samsung Galaxy A52s long-term review

Overall, the Galaxy A52s’ camera system is good for its price point, with a better ultra-wide than most competitors and a decent main camera whose OIS adds peace of mind for the longer exposures of Night Mode. It’s not a flagship-grade setup, and in terms of quality it’s not always head and shoulders above similarly priced rivals either.

We should also mention that during the several weeks of using the phone for this long-term review, we never encountered any real issues with the camera app. Although it can be slow to switch between cameras and sometimes to save your shots, it never crashed or froze, so the shooting experience was never disrupted.

Summary

At its current asking price, the Galaxy A52s is a great value proposition – perhaps the best value any A-series device has ever offered. So if you want a mid-range Samsung and don’t want to feel like you’ve overpaid, this is probably the one to go for, even today with its supposed successor, the A53, available. As we mentioned earlier, given the chipsets used, it might make more sense to consider the A73 the true successor to the A52s, but that’s up to you. Either way, both are already more expensive than the Galaxy A52s while not offering much to justify the price difference.

The Galaxy A52s comes with Samsung’s extended software update promise of three years of major Android updates and four years of security updates, which is currently unmatched in the mid-range space outside of Google and Apple. So, if a long update window is important to you and you don’t want to spend flagship-level money, then again, this is the option to go for.

Samsung Galaxy A52s long-term review

With all that said, the Galaxy A52s is definitely not the complete package, even at this price. And while updates are delivered reliably and on time, some have introduced bugs in the past, so not everything is up to snuff. While we’re on the subject of software, One UI 4.1 on the A52s looks the same as One UI 4.1 on Samsung’s flagships, but it doesn’t feel the same.

Some of that is undoubtedly down to the less powerful chipset, but not all of it – the same Snapdragon 778G feels faster in some competing devices. Our obvious suspicion is that Samsung simply isn’t optimizing the A series’ software as much as it now finally does for the S line and its foldables, and that’s an area that will definitely need to be addressed in the future. It’s a shame to see reasonably capable hardware paired with a comparatively poor software experience.

The Galaxy A52s’ battery capacity won’t break any records, but in real-world use it reliably lasted us a full day, and we never worried about running out of power on a single charge. That’s a good thing, because the relatively slow charging doesn’t really lend itself to quick midday top-ups. It’s nice to see a Samsung device still come with a charger in the box – and one that actually supports 25W charging, the maximum the phone can take, unlike the A52, which also supported 25W but shipped with a 15W charger in the box. Charging speed is another area where Samsung devices (regardless of price) are long overdue for improvement.

Samsung Galaxy A52s long-term review

The screen is perhaps the best part of this phone, which is great, because the screen is what you look at whenever you use it. This is a high-quality panel that is among the best (if not the best) you can find at this price. The 120Hz refresh rate is welcome, and while you shouldn’t expect flagship brightness levels, you’ll find it bright enough to remain legible even on a sunny summer day. Only just, but still.

The cameras are generally good, with the ultra-wide surprisingly punching above the phone’s price and consistently delivering better results than the competition. The main camera isn’t bad, but it’s pretty much what you’d expect from a mid-ranger – the only standout being the presence of OIS, which is still rare at this price.

Performance and smoothness are a bit below what we expected. If you’ve ever used the Redmi Note 10 Pro or any other Redmi with a Snapdragon 720G/730G/732G, things here will feel incredibly familiar – but with 5G support added, which is much less of a differentiator today than it used to be, yet still nice to have.

Samsung Galaxy A52s long-term review

At its current price, we wouldn’t call this mid-range Samsung the best mid-range smartphone ever, but with more attention paid to the software experience, it could have been the best mid-ranger of 2022.

As it is, there are competitors that feel faster and smoother, so ultimately it’s up to you whether you care about the A52s features those competitors can’t match: extensive software support, screen quality, OIS on the main camera, ultra-wide image quality, IP67 water and dust resistance, and even the branding on the back – because some people don’t want to stray from the Apple-Samsung duopoly.

Source: GSMARENA.COM

iPhone 16 Pro Review

iPhone 16 Pro
The iPhone 16 Pro is one of the least changed iPhones of the last few years, and at the same time, it offers the same reliable experience as before.

iPhone 16 Pro Review

We usually know Apple as a company that refuses to ship half-baked products or software features, preferring either to stay out of a new field entirely or to enter it with a product that offers the user a reliable, polished experience. By that standard, the iPhone 16 Pro is the most unfinished product in Apple’s history; let me explain.

Table of contents
  • iPhone 16 Pro video review
  • Camera and Camera Control
  • Ultrawide camera
  • Main camera
  • Telephoto camera
  • Portrait photography
  • Selfie camera
  • Performance and battery
  • Design and build quality
  • Display and speaker
  • Summary and comparison with competitors

Apple is marketing the iPhone 16 Pro with a focus on Apple Intelligence and its artificial intelligence capabilities. But right now, even to get a partial taste of Apple’s AI, you have to wait for the official release of iOS 18.1 in late October, more than a month after the iPhone 16’s launch. There’s no sign yet of the new Siri’s attractive animation either – the animation that inspired Apple to name the iPhone 16 event “It’s Glowtime.”

Dimensions of iPhone 16 Pro in hand

For those who have been out of the loop on the technology world since the early months of 2024, Apple Intelligence is Apple’s answer to Google’s Gemini, Samsung’s Galaxy AI, and even Microsoft’s Copilot. With Apple Intelligence, Siri is finally supposed to become what was promised at its unveiling 13 years ago: a full-fledged digital assistant that converses with the user in natural language. Beyond the upgraded Siri, capabilities such as AI-generated images and emoji, writing tools, and photo editing features are also coming to iOS.

Note that to experience Apple Intelligence with all of its features, we’ll have to wait for iOS 18.4, due in the early months of 2025. The iPhone 16 ships with iOS 18 out of the box, so with such a delay it’s no surprise that Apple lags behind its competitors – nor that the iPhone 16 Pro launches as an imperfect device.

Camera and Camera Control

Now that Apple Intelligence is out of the picture – and since, per Zoomit’s policy, we don’t review a device based on promises of future updates – let’s leave AI out of the iPhone 16 Pro review headlines and start with the part that has changed the most: the camera, or rather, the camera button.

Control camera button on iPhone 16 Pro frame
Working with iPhone 16 Pro camera control
iPhone 16 Pro camera control menu
iPhone 16 Pro cameras

While there had been talk of Apple working to remove the iPhone's physical buttons, this year, surprisingly, another button was added to the iPhone 16 family, although Apple insists on calling it Camera Control. Unfortunately, Camera Control is crude and incomplete both in its implementation and in its capabilities; I will explain why below.

As usual with Apple, Camera Control hides complex engineering behind a simple appearance. Its surface is made of sapphire and is surrounded by a stainless-steel ring color-matched to the body. Beneath this surface sit a precise force sensor with haptic feedback and a touch sensor, so that Camera Control can simulate the shutter button of a DSLR and recognize finger swipes across its surface.

Camera menu on iPhone 16 Pro

Apple says that by the end of this year, a software update will add a feature to Camera Control that lets the user focus on the subject with a half-press and take the photo with a full press, just like professional cameras and Xperia phones. Additionally, once Apple Intelligence is released, Camera Control will give access to Siri's visual search function.

Camera Control: an interesting idea, but very immature

Currently, Camera Control lets you take photos, record video, or change camera parameters. Pressing the button once launches the camera app; pressing it again takes a photo, while holding it down starts video recording, which stops as soon as you lift your finger.

In the camera app, a gentle double press without lifting your finger brings up the photography parameters; you can switch between options by swiping on the button's surface and enter a given parameter's settings with another gentle press. The available parameters include exposure, depth of field, zoom, switching between cameras, Style, and Tone; we will say more about the last two below.

Camera control in the camera viewfinder
Control camera settings
Control camera settings 2

To be honest, for me and many of my colleagues at Zoomit, touching the screen to navigate the camera menus was much easier and more straightforward than using Camera Control. Even after 10 days with the iPhone 16 Pro, reaching the photography parameters and swiping to adjust them remains fiddly and time-consuming; for example, it often happens that while swiping to adjust a parameter such as Tone, the phone decides to exit the Tone settings and start moving between parameters instead.

Another problem with Camera Control is the stiffness of the button: pressing it to take a picture shakes the phone, an issue that can blur the details of photos taken in the dark.

Beyond the stiffness of the button, the placement of Camera Control is also not optimal in my opinion. When using the phone in portrait orientation, especially with the Pro Max model, you are likely to struggle and need both hands; and if you hold the phone in your left hand, your fingers may sometimes press the button accidentally and disrupt what you are doing.

Applying changes to the color and lighting of iPhone 16 Pro photos

If Apple fixes the problems and bugs of Camera Control, it could prove useful in two cases: first, when zooming, because it allows more precise control over the zoom level; and second, for faster access to Apple's new Style and Tone camera settings, which are genuinely useful for photography enthusiasts. Here is why.

iPhones have long had their own photographic look: colors close to reality with a slight tendency toward warmth, and no trace of oversaturated, high-contrast rendering. To satisfy fans of contrasty, Google Pixel-style photography, Apple introduced the Photographic Styles feature with the iPhone 13, offering several different looks.

The battle of the flagships: comparison of the iPhone 16 Pro camera with the Pixel 9 Pro and Galaxy S24 Ultra [survey results]

iPhone 16 Pro, Pixel 9 Pro XL, or Galaxy S24 Ultra: which phone has the best camera? The result will surprise you.

With the iPhone 15, Apple adopted a processing policy that not everyone liked. In short, to use the full capacity of the powerful Photonic Engine to preserve detail in shadows and highlights, the iPhone pushes HDR a little too far, to the point where colors and shadows lose their punch and the old dramatic feel.

The bad news is that the iPhone 16 Pro follows the same policy and renders shadows weakly, so to speak; the good news is that the evolved version of Photographic Styles can now breathe new life into shadows and colors. With the new Photographic Styles you can change how skin tones and shadows are processed, and you can even change the photographic style after taking the photo.

Discover your photography style with the iPhone 16 Pro

Before we see the effect of Photographic Styles on photos, let's go over their different modes. iPhone photography styles are now divided into two general categories, Undertone and Mood; apart from the standard photography mode, 5 Undertone styles and 9 Mood styles are available. Undertone styles mainly adjust the skin tones of human subjects, while Mood styles work more like Instagram filters.

Undertone styles are as follows:

  • Standard: iPhone’s default photography mode
  • Amber: Intensifies the amber tone in photos
  • Gold: Intensifies the golden tone in photos
  • Rose Gold: Intensifies the pink-gold tone in photos
  • Neutral: Neutralizes warm undertones in photos
  • Cool Rose: Intensifies cool rose tones in photos
Kausar Nikomanesh, Zomit writer in the editorial office - Standard iPhone 16 Pro photography style
Undertone: Standard
Kausar Nikomanesh, Zomit writer in the editorial office - Amber iPhone 16 Pro photography style
Undertone: Amber
Kausar Nikomanesh, Zomit writer in the editorial office - Gold iPhone 16 Pro photography style
Undertone: Gold
Kausar Nikomanesh, Zomit writer in the editorial office - Rose Gold iPhone 16 Pro photography style
Undertone: Rose Gold
Kausar Nikomanesh, Zomit writer in the editorial office - Neutral iPhone 16 Pro photography style
Undertone: Neutral
Kausar Nikomanesh, Zomit writer in the editorial office - Cool Rose iPhone 16 Pro photography style
Undertone: Cool Rose

Mood styles are as follows:

  • Vibrant
  • Natural
  • Luminous
  • Dramatic
  • Quiet
  • Cozy
  • Ethereal
  • Muted B&W
  • Stark B&W
Kausar Nikomanesh, Zomit writer in the editorial office - Vibrant iPhone 16 Pro photography style
Mood: Vibrant
Kausar Nikomanesh, Zomit writer in the editorial office - Natural iPhone 16 Pro photography style
Mood: Natural
Kausar Nikomanesh, Zomit writer in the editorial office - Luminous iPhone 16 Pro photography style
Mood: Luminous
Kausar Nikomanesh, Zomit writer in the editorial office - Dramatic iPhone 16 Pro photography style
Mood: Dramatic
Kausar Nikomanesh, Zomit writer in the editorial office - Quiet iPhone 16 Pro photography style
Mood: Quiet
Kausar Nikomanesh, Zomit writer in the editorial office - Cozy iPhone 16 Pro photography style
Mood: Cozy
Kausar Nikomanesh, Zomit writer in the editorial office - Ethereal iPhone 16 Pro photography style
Mood: Ethereal
Kausar Nikomanesh, Zomit writer in the editorial office - Muted B&W iPhone 16 Pro photography style
Mood: Muted B&W
Kausar Nikomanesh, Zomit writer in the editorial office - Stark B&W iPhone 16 Pro photography style
Mood: Stark B&W

All styles can be customized with three new parameters: Palette, Color, and Tone. Palette changes the range of applied colors, Color adjusts the intensity of saturation, and, most importantly, Tone changes the intensity of shadows and contrast and can bring life back to iPhone photos.

While Palette is adjusted with a simple slider, Color and Tone are set with a two-axis control pad. Working with this pad is tedious: to change either parameter you have to keep your finger on the pad, and since you get no feedback about its exact position, it is hard to adjust one parameter while holding the other constant.

The iPhone 16 Pro photography experience is slightly different from the previous generation

If, like me, you don't feel like fiddling with the control pad and slider, you can reach the styles or the Tone parameter directly with the Camera Control button; and believe me, you can make iPhone photos considerably more attractive just by changing the Tone. For example, compare the following two photos:

Standard mode with Tone -7
Standard mode with Tone 0 (default)

As the photos above show, without changing styles and simply by reducing the Tone value, the shadows return to the photo and the black of Mohammad Hossein's T-shirt shows up better than before, thanks to the improved contrast.

Ultrawide camera

Photography styles aside, the iPhone 16 Pro camera hardware itself has undergone several major changes, one of the most important being the upgrade of the ultrawide camera sensor from 12 to 48 megapixels. The new sensor uses a Quad-Bayer filter and 0.7-micrometer pixels, so its physical dimensions appear to be no different from the previous generation's 1/2.55-inch sensor with 1.4-micrometer pixels.

Wide camera (main)

  • Sensor: 48-megapixel Sony IMX903, 1/1.28 inches, 1.22 µm pixels, phase-detection autofocus, sensor-shift optical stabilization
  • Lens: 24 mm, f/1.78 aperture
  • Capabilities: 12, 24, and 48-megapixel photography; 4K120 video recording; Dolby Vision, ProRes, and Log; portrait photography

Telephoto camera

  • Sensor: 12-megapixel Sony IMX913, 1/3.06 inches, 1.12 µm pixels, Dual Pixel phase-detection autofocus, sensor-shift optical stabilization
  • Lens: 120 mm, f/2.8 aperture, 5x optical zoom
  • Capabilities: 12-megapixel photography; 4K60 video recording; Dolby Vision, ProRes, and Log; portrait photography

Ultrawide camera

  • Sensor: 48 megapixels, 1/2.55 inches, 0.7 µm pixels, phase-detection autofocus
  • Lens: 13 mm, f/2.2 aperture
  • Capabilities: 12 and 48-megapixel photography; 4K60 video recording; Dolby Vision, ProRes, and Log; macro photography

Selfie camera

  • Sensor: 12-megapixel Sony IMX714, 1/3.6 inches, 1.0 µm pixels, phase-detection autofocus
  • Lens: 23 mm, f/1.9 aperture
  • Capabilities: 12-megapixel photography; 4K60 video recording; Dolby Vision, ProRes, and Log

To gather enough light per pixel, the ultrawide camera by default captures 12-megapixel photos by binning pixels 4-to-1, yielding effective 1.4-micrometer pixels; but with the HEIF Max photography format it is possible to shoot at the full 48 megapixels, giving the user more freedom to crop into photos.
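The binning arithmetic is easy to verify; here is a minimal sanity check in Python, using only the figures quoted in the spec table above (illustrative, not part of the review's test methodology):

```python
# 4:1 pixel binning on the 48 MP ultrawide sensor: each 2x2 group of
# photosites is merged into one larger effective pixel.
BIN_FACTOR = 4           # 2 x 2 photosites per binned pixel
native_mp = 48           # native resolution, megapixels
native_pitch_um = 0.7    # native pixel pitch, micrometers

binned_mp = native_mp / BIN_FACTOR                      # resolution after binning
binned_pitch_um = native_pitch_um * BIN_FACTOR ** 0.5   # pitch doubles (2x2 merge)

print(binned_mp)         # 12.0 -> matches the default 12 MP output
print(binned_pitch_um)   # 1.4  -> matches the 1.4 um effective pixels
```

Both numbers line up with the 12 MP default output and the 1.4 µm effective pixel size mentioned above.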

A building with a stone facade and a yard full of trees - iPhone 16 Pro ultrawide camera - 48 megapixel photo
48-megapixel ultrawide photo – iPhone 16 Pro
A building with a stone facade and a yard full of trees - iPhone 16 Pro ultrawide camera - 12 megapixel photo
12-megapixel ultrawide photo – iPhone 16 Pro
A building with a stone facade and a yard full of trees - iPhone 16 ultrawide camera
12-megapixel ultrawide photo – iPhone 16
Cutting ultrawide photo of iPhone 16 and 16 Pro - air conditioner in the terrace of the apartment
Crop ultrawide camera photos

As you can see in the images above, the iPhone's 48-megapixel ultrawide photo is somewhat more detailed in places, but generally softer than the 12-megapixel version. We also shot the same subject with the iPhone 16; there is no noticeable difference between the 12-megapixel photos of the two phones.

View of the buildings around Zomit office on Pakistan Street - iPhone 16 Pro Ultra Wide Camera in the dark
Ultrawide iPhone 16 Pro camera with 1/25 second exposure
View of the buildings around the Zomit office on Pakistan Street - iPhone 16 ultrawide camera in the dark
iPhone 16 ultrawide camera with 1/10 second exposure
Cutting ultrawide photos of iPhone 16 and 16 Pro in the dark
Crop ultrawide camera photos in the dark

In dark environments, the iPhone 16 Pro resorts to Night mode and long exposures far less often than the iPhone 16; as a result, its ultrawide night photos are sometimes less detailed than the iPhone 16's. In the photos above, for example, the iPhone 16 used a one-tenth-second exposure, while the iPhone 16 Pro's exposure was 60% shorter, at one twenty-fifth of a second; so it is no surprise that the cheaper iPhone's photo is more attractive!
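The "60% less" exposure figure checks out; a quick back-of-the-envelope calculation with the two shutter speeds quoted above:

```python
exp_iphone16 = 1 / 10   # seconds, iPhone 16 ultrawide night shot
exp_16_pro = 1 / 25     # seconds, iPhone 16 Pro ultrawide night shot

# How much shorter is the 16 Pro's exposure relative to the iPhone 16's?
reduction = 1 - exp_16_pro / exp_iphone16
print(f"{reduction:.0%}")  # 60% -> matches the figure stated in the review
```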

iPhone 16 Pro ultrawide camera photo gallery

Children's playground with slide - iPhone 16 Pro ultra-wide camera
A tree from below and in front of sunlight - iPhone 16 Pro ultrawide camera
Tehran Book Garden - iPhone 16 Pro ultrawide camera
Super wide view of Tehran food garden - iPhone 16 Pro Ultra Wide Camera
Zoomit office terrace - iPhone 16 Pro ultrawide camera
Tehran Food Garden - iPhone 16 Pro ultrawide camera
Stone facade of a building - iPhone 16 Pro ultra-wide camera
The buildings of Pakistan Street in Tehran in the dark of the night - iPhone 16 Pro ultrawide camera
Zoomit studio in the dark - iPhone 16 Pro ultrawide camera
View of the buildings of Pakistan Street in Tehran at night - Ultrawide camera of iPhone 16 Pro
Sunflower flower close-up - iPhone 16 Pro ultrawide camera
Close-up of a yellow flower - macro photo of the iPhone 16 Pro ultrawide camera

The ultrawide camera of the iPhone 16 Pro generally takes attractive photos, but it cannot quite be considered on par with the competition. The gap with the best on the market is most noticeable in the dark: the iPhone 16 Pro's ultrawide camera is not so impressive in low light and records relatively soft photos. To see how it fares against rivals, I suggest reading our comprehensive comparison of the 2024 flagship cameras.

Main camera

On paper, the iPhone 16 Pro's main 48-megapixel camera is no different from the previous generation in sensor size, pixel pitch, or lens specifications; but Apple now calls this camera Fusion and claims the sensor itself has become faster. Thanks to a new architecture called Apple Camera Interface, image data is transferred from the sensor to the chip at a higher rate, so the iPhone's main camera can now record 4K120 Dolby Vision video.

Record stunning videos with 120 frames per second video recording

HDR filming at 120 frames per second and 4K resolution requires very heavy processing, because implementing the HDR effect means comparing and merging many 4K frames with different exposures every second. If you have an external SSD and a high-speed USB 3 cable, you can also save 4K120 videos in the professional ProRes and Log formats, which give you more freedom when editing and color-grading.
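To get a feel for the processing load, here is a rough illustration of the raw pixel throughput of 4K120 capture. This is only the uncompressed pixel rate the pipeline must ingest; actual encoded bitrates depend on the codec (H.265 vs. ProRes) and are not computed here:

```python
# Raw pixel throughput of 4K (3840 x 2160) capture at 120 fps.
width, height, fps = 3840, 2160, 120
pixels_per_second = width * height * fps

print(pixels_per_second / 1e6)  # ~995.3 megapixels per second to process
```

Nearly a billion pixels per second, before HDR frame merging even begins, which is why the faster sensor readout and Apple Camera Interface matter.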

4K120 video sample 1

Watch on YouTube

4K120 video sample 2

Watch on YouTube

The iPhone 16 Pro's 4K120 videos are very attractive and detailed, and deliver a wonderful viewing experience. Since none of them could be uploaded to our video platform at their original quality, you will need to follow the YouTube links to watch them.

Thanks to the faster sensor and Apple's new interface, 48-megapixel HEIF Max photos are captured almost without pause, at a rate of about 4 frames per second. As in the previous generation, the iPhone combines multiple 12- and 48-megapixel frames and by default outputs 24-megapixel photos, a balanced compromise between contrast, color, and detail; of course, shooting at 12 megapixels alongside 48-megapixel HEIF Max remains possible.

Zomit office terrace - 48 megapixel photo of iPhone 16 Pro main camera
48-megapixel photo of the main camera
Zomit office terrace - 24 megapixel photo of iPhone 16 Pro main camera
24-megapixel photo of the main camera
Zomit office terrace - 12 megapixel photo of iPhone 16 Pro main camera
12-megapixel photo of the main camera
Crop photos of 12, 24 and 48 megapixels of the main iPhone 16 Pro camera
Crop of the 48, 24, and 12-megapixel photos

As the photos above show, the 48-megapixel mode improves detail to some extent at the cost of an overall softening of the image and gives you more freedom to crop, but its contrast and color saturation fall short of the 24- and 12-megapixel modes. The 24-megapixel photos strike a good balance of detail, color, and contrast.

Mohammad Hossein Moaidfar, the author of Zoomit - iPhone 16 Pro's main camera
iPhone 16 Pro main camera
Mohammad Hossein Moaidfar, the author of Zoomit - iPhone 16 main camera
iPhone 16 main camera
Cropping the photo of the main camera of the iPhone 16 and 16 Pro

In the photos above, the iPhone 16 Pro's main camera has recorded slightly more detail than the iPhone 16; but as you can see, the 16 Pro photo has lower contrast, its colors are warmer than the iPhone 16's, and the black of Mohammad Hossein's T-shirt is not deep enough.

iPhone 16 Pro main camera photo gallery

Children's playground - iPhone 16 Pro main camera
Candy with fruit decoration - iPhone 16 Pro main camera
An artificial lake around Tehran's book garden - iPhone 16 Pro main camera
Tehran book garden plants - iPhone 16 Pro main camera
Exterior view of Tehran Book Garden - main camera of iPhone 16 Pro
Two young people in the book garden of Tehran - iPhone 16 Pro main camera
Humvee military vehicle in Tehran's book garden - iPhone 16 Pro main camera
A view from inside the Tehran Book Garden - iPhone 16 Pro's main camera
A cat in the middle of the bushes - iPhone 16 Pro main camera in the dark
Orange motorcycle - main iPhone 16 Pro camera in the dark
Room ceiling lights - iPhone 16 Pro main camera with 2x zoom
Iranian Islamic view of Tehran mosque - iPhone 16 Pro main camera in the dark with 2x zoom
Brick facade of a building around Madras highway - iPhone 16 Pro main camera
Sugar goat statue in Tehran's book garden - iPhone 16 Pro main camera with 2x zoom
The statue of Zoro and Sergeant Garcia in Tehran's book garden - iPhone 16 Pro main camera with 2x zoom
The light in the garden - the main camera of the iPhone 16 Pro in the dark

Photos from the iPhone 16 Pro's main camera feel the same as the iPhone 15 Pro's: they are full of detail, and the colors look relatively natural with a slight warm tendency. The iPhone does not scrub away noise artificially, so even in the dark it pulls a high level of fine, intricate detail out of subjects. The large sensor also lets the iPhone record high-quality 2x photos by taking a 12-megapixel crop from the middle of the main camera's full-sensor image.
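The 2x crop arithmetic works out as stated: cropping the central half of each dimension of the 48 MP frame keeps a quarter of the sensor area, which is exactly 12 MP. A one-line check:

```python
native_mp = 48
zoom = 2
# An n-times zoom crop keeps 1/n of each dimension, i.e. 1/n^2 of the area.
cropped_mp = native_mp / zoom**2

print(cropped_mp)  # 12.0 -> the 12-megapixel 2x crop described above
```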

Telephoto camera

Besides the renewed ultrawide camera, the other big change is the addition of a 5x telephoto camera to the iPhone 16 Pro; last year this camera was exclusive to the iPhone 15 Pro Max. The new telephoto camera uses the same 12-megapixel sensor as the previous generation and offers digital zoom up to 25x.

iPhone 16 Pro telephoto camera photo gallery

World War era motorcycle - iPhone 16 Pro telephoto camera
Single car in Tehran Book Garden - iPhone 16 Pro telephoto camera
iPhone 16 Pro telephoto camera - 2
Street lights in front of a tall glass building - iPhone 16 Pro telephoto camera
The Little Prince in Tehran's Book Garden - iPhone 16 Pro telephoto camera
iPhone 16 Pro telephoto camera
Locust airplane replica in Tehran book garden - iPhone 16 Pro telephoto camera
Mural on Madras highway - iPhone 16 Pro telephoto camera
Yellow motorcycle in the dark - iPhone 16 Pro telephoto camera
Building under construction near Madras highway - iPhone 16 Pro telephoto camera
2 Star Cafe on Pakistan Street - iPhone 16 Pro telephoto camera
Tehran Mosli minaret in the dark - iPhone 16 Pro telephoto camera

The iPhone 16 Pro's telephoto camera records high-quality 5x photos; its detail and colors closely match the look of the main camera. The telephoto camera also holds up in low-light environments and takes good photos in the dark, though as we said in our comprehensive comparison of 2024 flagship cameras, competitors perform better in this area.

The main iPhone 16 Pro camera - the first example
1x photo
2x photo of iPhone 16 Pro
2x photo
3x photo of iPhone 16 Pro
3x photo
iPhone 16 Pro 5x photo
5x photo
10x photo of iPhone 16 Pro
10x photo
25x iPhone 16 Pro photo
25x photo

The combination of the iPhone 16 Pro's 48-megapixel main camera and its 5x telephoto camera allows relatively high-quality zoomed photos across the 1-10x range; apart from the 5x optical zoom, the iPhone looks quite satisfactory at the 2x and 10x levels.

Portrait photography

The iPhone 16 Pro relies on the main and telephoto cameras for portrait photography and uses the ToF sensor to accurately separate the subject from the background. 1x and 2x portrait photos are recorded with the main camera and 5x portrait photos are also recorded with the telephoto camera.

Kausar Nikomanesh, author of Zoomit - 1x portrait photo of iPhone 16 Pro
1x portrait photo
Kausar Nikomanesh, the author of Zoomit - 2x portrait photo of iPhone 16 Pro
2x portrait photo
Kausar Nikomanesh, the author of Zoomit - 5x portrait photo of iPhone 16 Pro
5x portrait photo
1x portrait photo of iPhone 16 Pro
1x portrait photo
Mohammad Hossein Moaidfar, the author of Zoomit - 2x photo of iPhone 16 Pro
2x photo with natural bokeh
5x portrait photo of iPhone 16 Pro
5x portrait photo

The iPhone has turned in strong portrait photography for several years now, and the iPhone 16 Pro is no exception. Portrait photos are detailed, and the bokeh effect rolls off gradually, much like a professional camera's. As we saw in the 2024 flagship camera comparison, the iPhone beats even tough competitors like the Pixel 9 Pro and S24 Ultra in portrait photography.

Selfie camera

The selfie camera of the iPhone 16 Pro is no different from the previous generation, and it still captures eye-catching photos with many details and true-to-life colors.

Mohammad Hossein Moaidfar and Hadi Ghanizadegan from Zomit - iPhone 16 Pro selfie camera
Mohammad Hossein Moidfar and Hadi Ghanizadegan from Zomit - iPhone 16 Pro selfie camera with bokeh effect

The iPhone 16 Pro can record 4K60 video in the Dolby Vision HDR standard with all of its cameras; you can also choose 24 or 30 frames per second. Videos are recorded with the H.265 codec by default, but switching to the more common H.264 codec is also possible.

We shot at 30 and 60 fps with the H.265 codec, and the iPhone 16 Pro recorded very detailed videos in both modes, with vivid colors, high contrast, and decent exposure control. To see how its video recording fares against other flagships, don't miss our iPhone 16 Pro vs. Pixel 9 Pro and Galaxy S24 Ultra camera comparison.

Performance and battery

The next big change in the iPhone 16 Pro is its chip. The A18 Pro's CPU uses the familiar combination of 2 high-performance cores and 4 efficiency cores, accompanied by a 6-core GPU and a 16-core neural processing unit. Apple's new chip is fabricated on TSMC's improved 3-nanometer process, N3E.

Technical specifications of the A18 Pro chip compared to the previous generation

Central processor

  • A17 Pro: 2 performance cores at 3.78 GHz with 16 MB cache; 4 efficiency cores at 2.11 GHz with 4 MB cache; 24 MB system cache
  • A18: 2 performance cores at 4.04 GHz with 8 MB cache; 4 efficiency cores at 2.0 GHz with 4 MB cache; 12 MB system cache
  • A18 Pro: 2 performance cores at 4.04 GHz with 16 MB cache; 4 efficiency cores at 2.2 GHz with 4 MB cache; 24 MB system cache

Instruction set

  • A17 Pro: ARMv8.6-A
  • A18: ARMv9.2-A
  • A18 Pro: ARMv9.2-A

Graphics

  • A17 Pro: 6 cores, 1398 MHz, 768 shading units, ray tracing
  • A18: 5 cores, 1398 MHz, 640 shading units, ray tracing
  • A18 Pro: 6 cores, 1450 MHz, 768 shading units, ray tracing

Memory controller

  • A17 Pro: 4 16-bit channels, 3200 MHz LPDDR5X RAM, 51.2 GB/s bandwidth
  • A18: 4 16-bit channels, 3750 MHz LPDDR5X RAM, 58.6 GB/s bandwidth
  • A18 Pro: 4 16-bit channels, 3750 MHz LPDDR5X RAM, 58.6 GB/s bandwidth

Video recording and playback

  • A17 Pro: 4K60 10-bit H.265
  • A18: 8K24 / 4K120 10-bit H.265
  • A18 Pro: 8K24 / 4K120 10-bit H.265

Wireless connectivity

  • A17 Pro: Bluetooth 5.3 and Wi-Fi 7
  • A18: Bluetooth 5.3 and Wi-Fi 7
  • A18 Pro: Bluetooth 5.3 and Wi-Fi 7

Modem

  • A17 Pro: X70 modem, 7500 Mbps download, 3500 Mbps upload
  • A18: X75 modem, 10,000 Mbps download, 3500 Mbps upload
  • A18 Pro: X75 modem, 10,000 Mbps download, 3500 Mbps upload

Manufacturing process

  • A17 Pro: TSMC 3-nanometer
  • A18: TSMC 3-nanometer (enhanced N3E)
  • A18 Pro: TSMC 3-nanometer (enhanced N3E)
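The memory-bandwidth figures can be roughly reconstructed from the channel specs. The sketch below assumes LPDDR5X transfers data on both clock edges (DDR), so 3750 MHz corresponds to 7500 MT/s; note the table's 58.6 figure appears to use a mixed decimal/binary convention, while the A17 Pro's 51.2 GB/s is plain decimal:

```python
# Sanity check on the memory-bandwidth figures quoted above.
channels, channel_width_bits = 4, 16
transfers_per_second = 3750e6 * 2               # DDR: two transfers per clock
bus_bytes = channels * channel_width_bits / 8   # 64-bit bus -> 8 bytes/transfer

bandwidth_decimal_gb = transfers_per_second * bus_bytes / 1e9
print(bandwidth_decimal_gb)  # 60.0 GB/s in decimal units

# 60,000 MB/s divided by 1024 gives ~58.6, which matches the table's figure;
# the A17 Pro's 51.2 GB/s is simply 6400 MT/s x 8 bytes in decimal units.
print(60000 / 1024)          # ~58.6
```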

Apple says the CPU uses new cores that deliver 15% faster performance than the A17 Pro, or match its performance while drawing 20% less power. Apple also notes that the A18 Pro carries more cache memory than the plain A18 chip.

The A18 Pro chip delivers faster single-core performance than even desktop processors that draw well over 100 watts.

According to Apple, the A18 Pro's 6-core GPU is 20% faster than the previous generation, and the ray-tracing accelerator in the new GPU is twice as fast as before.

Playing mobile games on iPhone 16 Pro

The A18 Pro's 16-core neural processing unit is, like the previous generation, capable of 35 trillion operations per second; but thanks to a 17% increase in bandwidth between the RAM and the chip, the new NPU performs better than before in real-world applications. The A18 Pro is paired with 8 GB of LPDDR5X-7500 RAM through a high-speed memory controller.

iPhone 16 Pro performance against competitors

Scores below are Speedometer 2.1 (web-browsing experience), GeekBench 6 GPU compute (Vulkan/Metal), GeekBench 6 CPU (single / multi), and the GFXBench Aztec Ruins game simulator (onscreen / 1440p):

  • iPhone 16 Pro (A18 Pro): Speedometer 572; GPU compute 33105; CPU 3542 / 8801; Aztec Ruins 59 / 70
  • iPhone 16 (A18): Speedometer 554; GPU compute 28025; CPU 3440 / 8406; Aztec Ruins 59 / 61
  • iPhone 15 Pro (A17 Pro): Speedometer 475; GPU compute 27503; CPU 2960 / 7339; Aztec Ruins 59 / 46.8
  • Pura 70 Ultra, performance mode (Kirin 9010): Speedometer 235; GPU compute 1528 (failed); CPU 1452 / 4494; Aztec Ruins 32 / 30
  • Pixel 9 Pro (Tensor G4): Speedometer 221; GPU compute 6965; CPU 1945 / 4709; Aztec Ruins 70 / 44
  • Galaxy S24 Ultra (Snapdragon 8 Gen 3 for Galaxy): Speedometer 240; GPU compute 17012; CPU 2262 / 7005; Aztec Ruins 75 / 81

The iPhone 16 Pro is noticeably faster than current Android flagships; its roughly 60% lead over the Galaxy S24 Ultra in single-core CPU performance goes a long way toward explaining how snappy the iPhone 16 Pro feels in everyday use.
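Plugging the GeekBench 6 single-core numbers from the table above into a quick calculation confirms the rough figure:

```python
single_core = {"iPhone 16 Pro": 3542, "Galaxy S24 Ultra": 2262}

# Relative single-core advantage of the iPhone 16 Pro over the S24 Ultra.
advantage = single_core["iPhone 16 Pro"] / single_core["Galaxy S24 Ultra"] - 1
print(f"{advantage:.0%}")  # 57% -> the "about 60%" lead cited in the review
```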

Apple's 2024 flagship holds about a 95% advantage over a rival like the Galaxy S24 Ultra when the GPU is used for compute tasks such as blurring photo backgrounds or face recognition; in game rendering, however, the advantage still lies with the Galaxy and its Snapdragon 8 Gen 3 chip.

The performance of the neural processing unit of the iPhone 16 Pro against competitors

Scores below are GeekBench AI single-precision (FP32) results, with the framework and backend used on each phone:

  • iPhone 16 Pro: Core ML framework, Neural Engine backend, score 4647
  • iPhone 15 Pro: Core ML framework, Neural Engine backend, score 3862
  • Pura 70 Ultra: TensorFlow Lite framework, NNAPI backend, score 235
  • Pixel 9 Pro: TensorFlow Lite framework, NNAPI backend, score 347
  • Galaxy S24 Ultra: TensorFlow Lite framework, NNAPI backend, score 477

In the GeekBench AI benchmark, the iPhone 16 Pro's neural processing unit outperforms the Galaxy S24 Ultra by an astronomical 870%; we will have to wait for the release of Apple's AI features to see whether such a gap is real or just a quirk of the benchmark software.
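The 870% figure follows directly from the FP32 scores in the table above:

```python
npu_scores = {"iPhone 16 Pro": 4647, "Galaxy S24 Ultra": 477}

# Relative NPU advantage of the iPhone 16 Pro in GeekBench AI (FP32).
gap = npu_scores["iPhone 16 Pro"] / npu_scores["Galaxy S24 Ultra"] - 1
print(f"{gap:.0%}")  # 874% -> the "astronomical ~870%" cited in the review
```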

Like the previous generation, Apple sells the iPhone 16 Pro in 128, 256, and 512 GB and 1 TB versions with NVMe storage, while the base iPhone 16 Pro Max starts at 256 GB. Benchmarks show the iPhone 16 Pro's storage speed is no different from the previous generation.

iPhone 16 Pro storage speed compared to competitors

  • iPhone 16 Pro: 1636 MB/s sequential read, 1340 MB/s sequential write
  • iPhone 15 Pro: 1652 MB/s sequential read, 1380 MB/s sequential write
  • Pixel 9 Pro XL: 1350 MB/s sequential read, 171 MB/s sequential write
  • Galaxy S24 Ultra: 2473 MB/s sequential read, 1471 MB/s sequential write

Setting the numbers aside, the truth is that in everyday use the iPhone 16 Pro does not feel much different from the iPhone 15 Pro or even the iPhone 14 Pro. The value of that performance gap is headroom: the phone will still feel fast by the standards of a few years from now, and of course it can handle the heavy processing of Apple Intelligence.

Apple says that thanks to changes in the iPhone 16 Pro's internal structure, including the battery's metal casing (Pro model only), the phone can now sustain up to 20% higher performance under heavy load. The improvement in performance stability is noticeable to some extent: the phone does not get as hot during graphics-heavy games and throttles less than before. In Zoomit's stability test, the iPhone 16 Pro throttled less than both the Galaxy S24 Ultra and the previous generation, and its body temperature peaked at 47 degrees Celsius.

To measure the iPhone 16 Pro's performance stability outside of heavy gaming, we ran a CPU stress test; it loads all CPU cores for 20 minutes and shows what share of its peak performance the CPU can still deliver after heating up under sustained load.

iPhone 16 Pro CPU stress test
CPU performance stability test under heavy processing load for 20 minutes

In our tests, the iPhone 16 Pro could still deliver 84% of its peak performance after 20 minutes, so it should rarely lag or drop frames even in very heavy use. During the CPU stress test, the body of the device reached about 45 degrees Celsius.

This year, Apple has increased the battery capacity of the iPhone 16 Pro and 16 Pro Max by about 10%. Together with the A18 Pro chip's efficiency, this gives the new flagships very good battery life; so much so that Apple calls the iPhone 16 Pro Max "the best iPhone ever in terms of battery life."

iPhone 16 Pro battery life in the battery menu

Apple quotes the battery life of the new iPhones as video-playback time, saying the iPhone 16 Pro lasts 4 hours longer than the previous generation, at 27 hours of video playback. Zoomit's tests measured 26 hours and 5 minutes for the new iPhone, which is more or less consistent with Apple's claim.

iPhone 16 Pro battery life against competitors

Battery capacities are in milliampere-hours; video-playback and everyday-use endurance are in hours:minutes (everyday-use figures were not measured for the iPhones):

  • iPhone 16 Pro: 6.3-inch 120 Hz display, 2622 x 1206 pixels; 3582 mAh; video playback 26:05
  • iPhone 15 Pro: 6.1-inch 120 Hz display, 2556 x 1179 pixels; 3274 mAh; video playback 21:11
  • iPhone 15 Pro Max: 6.7-inch 120 Hz display, 2796 x 1290 pixels; 4441 mAh; video playback 24:43
  • Pixel 9 Pro XL: 6.8-inch 120 Hz display, 2992 x 1344 pixels (native); 5060 mAh; video playback 25:00; everyday use 13:25
  • Pura 70 Ultra: 6.8-inch 120 Hz display, 2844 x 1260 pixels; 5200 mAh; video playback 25:00; everyday use 17:00
  • Galaxy S24 Ultra: 6.8-inch 120 Hz display, 3088 x 1440 pixels; 5000 mAh; video playback 27:41; everyday use 14:05

Another change in the iPhone 16 Pro concerns charging speed. Apple's new flagship now supports wired charging at 30 watts, and when the same charger feeds a MagSafe wireless charging pad, wireless charging power reaches 25 watts, which Apple says can take the battery from zero to 50% within 30 minutes.

Very good charging, and better than the previous generation

Although the wired charging speed of the iPhone 16 Pro has increased from 20 to 30 watts, a full charge still takes about 100 minutes. That is because the battery capacity has grown by 10%, and because the iPhone charges very slowly between 85 and 100%: even with optimized battery charging turned off, the phone needs about 35-40 minutes to fill the last 15% of its capacity.
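As a back-of-the-envelope sketch of what those numbers imply, assuming a simple two-phase charge curve (our simplification, not Apple's published specification):

```python
# Rough arithmetic sketch of the charging profile described above:
# ~100 minutes total, with the final 15% alone taking ~35-40 minutes.

total_minutes = 100          # observed full-charge time
slow_tail_minutes = 37.5     # midpoint of the 35-40 min for the last 15%

fast_phase_minutes = total_minutes - slow_tail_minutes   # 0-85% phase
fast_rate = 85 / fast_phase_minutes    # percent of capacity per minute
slow_rate = 15 / slow_tail_minutes

print(round(fast_rate, 2), round(slow_rate, 2))  # → 1.36 0.4
```

In other words, under these assumptions the phone fills roughly 1.4 percentage points per minute up to 85%, then slows to about 0.4 points per minute for the tail, which is typical lithium-ion behavior to protect battery health.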

Design and build quality

Setting aside the iPhone's more fundamental changes, what you notice at first glance is the larger size of the phone, especially on the iPhone 16 Pro Max, and the narrower bezels around the screen.

Home screen apps and widgets on the iPhone 16 Pro screen

The iPhone 16 Pro and Pro Max use 6.3- and 6.9-inch screens, 0.2 inches larger in diagonal than the last several generations, so it is no surprise that the physical dimensions and weight have grown too. Both phones are about 3 mm taller and 1 mm wider, and 12 and 6 grams heavier, respectively. The weight increase is thus more noticeable on the iPhone 16 Pro, while the 16 Pro Max sits worse in the hand than before and practically demands two-handed use.

Dynamic Island iPhone 16 Pro close-up

The bezels around the display have become noticeably narrower: the screen of the iPhone 16 Pro is now surrounded by a border just over one millimeter thick (1.15 millimeters, to be exact), while the bezels of the iPhone 15 Pro measure about 1.5 mm and those of the iPhone 16 more than 2 mm. Keep in mind, though, that with a case on the phone the narrow bezels are less noticeable.

Dynamic Island iPhone 16 Pro
iPhone 16 Pro screen close-up

Another change in the appearance of the iPhone 16 Pro is the addition of a Desert Titanium color option and the removal of Blue Titanium. The new color is close to cream, with a golden frame; unfortunately, we did not have this color to review. The other options are limited to the neutral and understated Black Titanium, White Titanium, and Natural Titanium.

iPhone 16 Pro in hand

In every other respect, the design of the iPhone 16 Pro is no different from the previous generation: the same flat titanium frame with flat glass panels on the back and front, assembled with high precision into a solid structure with an IP68 rating. Unlike the iPhone 16, there has been no change in the finishing process of the back panel or the arrangement of the cameras; only the screen cover has been upgraded to the third-generation Ceramic Shield, which, according to Apple, is twice as tough as the previous generation.

Camera control button on iPhone 16 Pro

 

We talked about Camera Control and its not-so-ergonomic position on the right side of the frame at the beginning of the article. Apart from this new button, the controls are the same as the previous generation: the volume buttons and the Side button are well placed and give very good feedback, and the Action button, as before, can be personalized.

Read more: iPhone 14 Plus review, price and technical specifications

Display and speaker

Finally, another largely unchanged part is the iPhone 16 Pro display, which uses the same 120 Hz OLED panel with LTPO technology. This year, thanks to the 0.2-inch increase in screen diagonal, its resolution reaches 2622 x 1206 pixels, for a very good density of 460 pixels per inch. As before, the display supports HDR standards including HDR10 and Dolby Vision; and as with the last few generations, it is unclear whether the panel is true 10-bit or 8-bit + FRC.

Watch video with iPhone 16 Pro

Thanks to LTPO technology, the iPhone 16 Pro can vary its refresh rate between 1 and 120 Hz depending on the type and motion of the content, so the phone can display smooth animations and match the frame rate of games and videos without hurting battery life.
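As an illustration of the idea, here is an assumed toy policy, not Apple's actual ProMotion algorithm; the set of supported rates is hypothetical:

```python
# Toy sketch of frame-rate matching on a variable-refresh (LTPO) panel:
# pick the lowest rate that is an integer multiple of the content's frame
# rate, so motion stays judder-free while the panel saves power.
# The `supported` tuple is an assumption for illustration.

def pick_refresh_rate(content_fps, supported=(1, 10, 24, 30, 48, 60, 80, 120)):
    """Lowest supported rate that divides evenly by the content frame rate."""
    for hz in sorted(supported):
        if hz % content_fps == 0:
            return hz
    return max(supported)   # fall back to the panel's maximum

print(pick_refresh_rate(24))   # film content → 24
print(pick_refresh_rate(30))   # → 30
print(pick_refresh_rate(60))   # → 60
```

A 24 fps film can thus be shown at 24 Hz instead of 120 Hz, which is where much of the battery saving comes from.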

iPhone 16 Pro display performance against competitors

| Product | Minimum brightness (nits) | Maximum brightness, manual (nits) | Maximum brightness, auto/HDR (nits) | Color gamut coverage | Average color error |
| --- | --- | --- | --- | --- | --- |
| iPhone 16 Pro | 1.35 | 1044 | 1950 (HDR) | 99.7% | 0.98 |
| iPhone 15 Pro | 2.21 | 1052 | 1947 (HDR) | 99.7% | 1.0 |
| iPhone 15 Pro Max | 2.15 | 1041 | 1950 (HDR) | 99.7% | 0.9 |
| Pixel 9 Pro XL | 4 | 1300 | 2650 (HDR) | 97.2% (Natural), 81.6% (Adaptive) | 1.1 (Natural), 3 (Adaptive) |
| Huawei Pura 70 Ultra | 2.5 | 740 | 1500 (HDR) | 99.7% (Natural), 79.7% (Vivid) | 1.9 (Natural), 5.3 (Vivid) |
| Galaxy S24 Ultra | 0.7 | 914 | 2635 (HDR) | 102% (Natural), 81.8% (Vivid) | 3.5 (Natural), 4.4 (Vivid) |

Apple says that the iPhone 16 Pro display, like the previous generation, supports the wide P3 color gamut, reaches 1000 nits of brightness in manual mode, and peaks at 2000 nits in automatic mode or during HDR video playback; but the important difference between the iPhone 16 Pro panel and the iPhone 15 Pro comes down to the minimum brightness.

Zoomit's measurements confirm Apple's claims about the iPhone's brightness. We measured the iPhone 16 Pro's minimum brightness at 1.35 nits, significantly lower than the previous generation's 2.15 nits; but the maximum brightness in manual mode and while displaying HDR video is no different from the iPhone 15 Pro, at 1044 and 1950 nits, respectively. It is worth noting that the iPhone 16 Pro reached 1296 nits in automatic mode while displaying SDR content (anything other than HDR video playback); under strong ambient light, it can probably approach the same 2000-nit range.

iPhone 16 Pro Type-C port

The iPhone 16 Pro uses stereo speakers: the main channel sits on the bottom edge of the frame, and the earpiece acts as the second channel. The iPhone's volume may not reach that of competitors such as the Pixel 9 Pro or Galaxy S24 Ultra, but the output quality is on a higher level; the iPhone's sound is clearer and its bass much stronger than its rivals'.

Summary and comparison with competitors

If we assume that the government finally acts rationally and gets the iPhone registry working, it is not very logical for iPhone 15 Pro or even iPhone 14 Pro users to buy the iPhone 16 Pro at tens of millions of tomans in additional cost; unless the 5x telephoto camera (versus the iPhone 15 Pro and both 14 Pro models), the 15-30% faster chip, or Apple Intelligence (for iPhone 14 Pro users) is critical to them.

Users of iPhone 13 Pro or older models have more reasons to buy the iPhone 16 Pro; Better charging, more RAM, a more efficient camera, a brighter screen with Dynamic Island, a faster chip, and perhaps finally artificial intelligence capabilities, can all justify spending money to upgrade from the iPhone 13 Pro to the 16 Pro.

If the ecosystem is not a limiting factor for you, the Galaxy S24 Ultra, even a year after its launch and at a much lower price, offers more or less the same experience with Galaxy AI that Apple promises with Apple Intelligence, and in most photography and videography scenarios it is on par with the iPhone 16 Pro, or even comes out ahead.

Naturally, we could not test the iPhone 16 Pro's headline competitive advantage: Apple Intelligence is the focus of Apple's marketing for this phone, but to experience all of its capabilities we have to wait until early 2025. Moreover, a significant part of these features will be available on the iPhone 15 Pro with essentially the same experience.

Working with iPhone 16 Pro

The iPhone 16 Pro is a very attractive phone; but at least in the first month after its release, it is not in line with Apple's philosophy. We know Apple as a company that delivers mature, polished features from day one; but apparently, in the age of artificial intelligence, we have to get used to half-baked launches and delays. First it was Google, Microsoft, and Samsung; now it is Apple's turn.


Geoffrey Hinton
Geoffrey Hinton, the godfather of artificial intelligence, revolutionized our world by inventing artificial neural networks. Don't miss the story of his life of ups and downs.

Biography of Geoffrey Hinton; The godfather of artificial intelligence

Geoffrey Hinton, a scientist who has rightly been called the "Godfather of Artificial Intelligence," revolutionized the world of technology with his research. Inspired by the human brain, he built artificial neural networks and gave machines the ability to learn, think, and make decisions. The technologies that surround our lives today, from voice assistants to self-driving cars, are the result of the relentless efforts of Hinton and his colleagues.

Hinton is now recognized as one of the most influential scientists of the 20th century, having won the 2024 Nobel Prize in Physics. But his story goes beyond awards and honors.

Geoffrey Hinton’s story is a story of perseverance, innovation, and the constant search to discover the unknown. In this article, we will look at the life and achievements of Geoffrey Hinton and we will answer the question of how one person with a simple idea was able to revolutionize the world of technology.

From physical problems to conquering the digital world

Hinton has worked standing up for almost 18 years. He cannot sit for more than a few minutes due to a herniated disc in his back, but even that has not stopped his work. "I hate standing and prefer to sit, but if I sit, a disc in my lower back bulges out and I feel excruciating pain," he says.

Since driving or sitting in a bus or subway is very difficult and painful for Hinton, he prefers to walk instead of using a private car or public transportation. The long-term walks of this scientist show that he has not only surrendered to his physical conditions but also to what extent he is eager to conduct scientific research and achieve results.

Hinton has been standing for years

For about 46 years, Hinton has been trying to teach computers to learn the way humans do. The idea seemed impossible and hopeless at first, but time proved otherwise, so much so that Google hired Hinton and asked him to make artificial intelligence a reality. "Google, Amazon, and Apple think artificial intelligence is what will make their future," Hinton said in an interview after being hired by Google.

Google hired Hinton to make artificial intelligence a reality

Heir to genius genes

Hinton was born on December 6, 1947, in England, into an educated and famous family with a rich scientific background. Most of his family members were trained in mathematics and economics. His father, Howard Everest Hinton, was a prominent entomologist, and all his siblings did important scientific research.

Hinton knew from the age of seven that he would one day reach an important position

Some of the world's leading mathematicians, such as George Boole, the founder of Boolean logic, and Charles Howard Hinton, a mathematician known for his visualizations of higher dimensions, were among Hinton's relatives. So from a young age there was great pressure on Hinton to excel academically, so much so that he was thinking about a doctorate from the age of seven.

Geoffrey Hinton at seven years old
Psychology, philosophy, and artificial intelligence; a powerful combination for creating the future

Hinton took a diverse academic path; He began his primary education at Clifton College in Bristol and then went to Cambridge University for further studies. There, Hinton constantly changed his major, vacillating between the natural sciences, art history, and philosophy. Finally, he graduated from Cambridge University in 1970 with a bachelor’s degree in experimental psychology.

Hinton's interest in understanding the brain and how humans learn led him to artificial intelligence. He went to the University of Edinburgh to continue his studies, where he began research in artificial intelligence under his mentor, Christopher Longuet-Higgins. Finally, in 1978, Hinton achieved the dream he had harbored since the age of seven and received his doctorate in artificial intelligence. The PhD was a turning point in Hinton's career and prepared him to enter the complex and fascinating world of artificial intelligence.

Hinton’s diverse education, from psychology to artificial intelligence, gave him a comprehensive and interdisciplinary perspective that greatly contributed to his future research. This perspective enabled him to make a deep connection between the functioning of the human brain and machine learning algorithms.

Out of a deep interest in how the human mind works, Hinton first studied physiology and the anatomy of the human brain as an undergraduate, then moved to psychology, and finally to artificial intelligence, where he completed his studies. His goal in entering artificial intelligence was to simulate the human brain.

If you want to learn about the functioning of a complex device like the human brain, you have to build one like it.

– Geoffrey Hinton

Hinton believed that to understand a complex device like the brain deeply, one should build a device similar to it. We normally think we know how cars work, for example, but building one reveals many details we had no knowledge of beforehand.

Alone against the crowd, but victorious

While struggling with his ideas and their opponents, Hinton met a number of researchers in the field of artificial intelligence, such as Frank Rosenblatt. Rosenblatt was an American scientist who revolutionized the field in the 1950s and 1960s by inventing and developing the perceptron model.

The perceptron model, one of the first machine learning models, is recognized as the main inspiration for the development of today’s artificial neural networks. Perceptron is a simple algorithm used to classify data. This model is inspired by the way brain neurons work. A perceptron is a mathematical model for an artificial neuron that receives various inputs, processes them using a weighted function, and decides on the output.

Hinton and Rosenblatt side by side
Hinton and Rosenblatt side by side

Rosenblatt's hope was that one could feed a neural network a set of data, such as photographs of men and women, and the network, like a human, could learn how to sort the photographs. But there was one problem: the perceptron model did not work very well. Rosenblatt's neural network was a single layer of neurons, too limited to perform the image-sorting task assigned to it.
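To make the discussion concrete, here is a minimal perceptron in the spirit of Rosenblatt's model: weighted inputs, a threshold, and a simple error-driven update rule. It is shown learning the linearly separable AND function; a single layer like this famously cannot learn XOR, which is exactly the kind of limitation discussed above. The hyperparameters are illustrative choices, not Rosenblatt's originals.

```python
# Minimal single-layer perceptron with the classic error-correction rule.

def train_perceptron(data, epochs=20, lr=0.1):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in data:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out              # 0 when the prediction is right
            w[0] += lr * err * x1           # nudge weights toward the target
            w[1] += lr * err * x2
            b += lr * err
    return w, b

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(AND)
predict = lambda x1, x2: 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
print([predict(x1, x2) for (x1, x2), _ in AND])  # → [0, 0, 0, 1]
```

Swap `AND` for the XOR truth table and no amount of training will succeed, because no single line can separate XOR's classes; that is the limitation Minsky-era critics seized on.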

Even when no one believed in artificial intelligence, Hinton didn’t lose hope

In the late 1960s, Rosenblatt's colleague wrote a book about the limitations of Rosenblatt's neural network. After that, for about ten years, research on neural networks and artificial intelligence almost stopped. Hardly anyone wanted to work in the field, convinced that no clear results would come of it. "Hardly anyone," because for Hinton the subject of artificial intelligence and neural networks was something else entirely.

Hinton believed there must be a way to simulate the human brain and build a device like it; he had no doubt about it. Why did Hinton pursue a path few would follow and almost no one saw a happy ending for? Convinced that even the majority can be mistaken, this eminent scientist continued on his way and did not give up.

From America to Canada; A journey that changed the course of artificial intelligence

Hinton went to different research institutes in America during his research. At that time, the US Department of Defense funded many US research institutions, so most of the projects carried out or underway focused on military objectives. Hinton was not interested in working in the military field and was looking for pure scientific research and the development of technology for human and general applications. As a result, he was looking for a place where he could continue his research away from the pressures of the military and the limitations of dependent funds.

I did not want my research to be funded by military organizations, because the results obtained would certainly not be used for human benefit.

– Geoffrey Hinton

After searching for a suitable place to continue research, Canada seemed to be the most suitable option. Finally, Hinton moved to Canada in 1987 and began his research at the University of Toronto. In the same years, Hinton and his colleagues were able to solve problems that simpler neural networks could not solve by building more complex neural networks.

Instead of building and extending single-layer neural networks, Hinton and his colleagues developed multilayer ones. These networks worked well and drew a line under all the earlier disappointments and failures. In the late 1980s, Dean Pomerleau built a self-driving car using a neural network and drove it on various roads.

In the 1990s, Yann LeCun, one of the pioneers of artificial intelligence and deep learning, developed a system called “Convolutional Neural Networks” (CNNs). These networks became the basis for many modern techniques in machine vision and pattern recognition. One of the first important applications of these networks was to build a system that could recognize handwritten digits; But once again, after the construction of this system, researchers in the field of artificial intelligence reached a dead end.

In the 1990s, an interesting neural network was built, but it stalled due to insufficient data.

The neural networks built at the time did not work well, owing to a lack of sufficient data and computing power. As a result, researchers in computer science and artificial intelligence once again concluded that neural networks were nothing more than a fantasy. In 1998, after 11 years at the University of Toronto, Geoffrey Hinton left to found and direct the Gatsby Computational Neuroscience Unit at University College London, where he studied neural networks and their applications.

AlexNet: A Milestone in the History of Artificial Intelligence

From the 1990s into the 2000s, Hinton was one of the very few people who still believed in the development of neural networks and artificial intelligence. He attended many conferences in pursuit of his goal but was usually met with indifference and treated like an outcast. You might think Hinton never gave up and pressed on with unbroken hope, but that is not the case: he, too, was sometimes disappointed and doubted he would reach his goal. Yet, overcoming despair, he continued on his way no matter how difficult it was, because one sentence kept repeating in his mind: "Computers can learn."

Watch: The story of the birth of artificial intelligence, the exciting technology that shook the world

After returning to the University of Toronto in 2001, Hinton continued his work on neural network models and, with his research group, spent the 2000s developing deep learning technology and putting it to practical use. By 2006, the world had begun to catch on to Hinton's ideas and no longer saw them as far-fetched.

In 2012, Hinton, along with two of his PhD students, Alex Krizhevsky and Ilya Sutskever (later a co-founder of OpenAI, the creator of ChatGPT), developed an eight-layer neural network called AlexNet. Its purpose was to identify images in ImageNet, a large online database of images. AlexNet's performance was stellar, beating the error rate of the most accurate program up to that point by about 40 percent. The image below shows the architecture of the AlexNet convolutional neural network.

AlexNet neural network

Viso

In the image above, C1 to C5 are convolutional layers that extract image features. Each layer has convolutional filters of different sizes that are applied to the image or output of the previous layer to detect different features. Also, the number of channels in each layer (96, 256 and 384) shows the number of filters used in that layer.

After feature extraction, the image is sent to fully connected layers (FC6 to FC8). Each circle in these layers represents a neuron that is connected to the neurons of the previous layer.

FC8 is the final output layer and consists of 1000 neurons. Due to the high number of layers and the ability to learn complex image features, the AlexNet architecture was very accurate in image recognition and paved the way for further improvements in the field of neural networks.
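The feature-map sizes in the figure follow from the standard convolution output-size formula. A small sketch, with layer hyperparameters as in the 2012 AlexNet paper:

```python
# Spatial size of a convolution's output: floor((W - K + 2P) / S) + 1,
# where W is the input size, K the kernel size, P padding, S stride.

def conv_out(size, kernel, stride=1, padding=0):
    return (size - kernel + 2 * padding) // stride + 1

# AlexNet C1: 227x227 input, 11x11 kernels, stride 4 -> 55x55 feature maps
c1 = conv_out(227, 11, stride=4)
# 3x3 max-pooling with stride 2 then shrinks 55 -> 27
p1 = conv_out(c1, 3, stride=2)
print(c1, p1)  # → 55 27
```

The channel counts (96, 256, 384) are independent of this formula: they are simply how many filters each layer applies.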

After developing AlexNet, Hinton and two of his students founded a company called DDNresearch, which was acquired by Google for $44 million in 2013. That same year, Hinton joined Google’s artificial intelligence research team, Google Brain, and was later appointed one of its vice presidents and chief engineers.

Hinton on Google

Businesstoday

From Backpropagation Algorithms to Capsule Networks: Hinton’s Continuous Innovations

Hinton has written or co-authored more than 200 scientific papers on the use of neural networks for machine learning, memory, perception, and symbol processing. While a postdoctoral fellow at the University of California, San Diego, Hinton worked with David E. Rumelhart and Ronald J. Williams to apply the backpropagation algorithm to multilayer neural networks.

Hinton said in a 2018 interview that the main idea of the algorithm was Rumelhart's; nor were Hinton and his colleagues the first to propose backpropagation. In 1970, Seppo Linnainmaa proposed a method called reverse-mode automatic differentiation, of which the backpropagation algorithm is a special case.

Hinton and his colleagues took a big step in their research after publishing their paper on the error backpropagation algorithm in 1986. This article is one of Hinton’s most cited articles with 55,020 citations.
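The core idea of backpropagation, applying the chain rule layer by layer to turn the output error into weight gradients, can be sketched in a few lines. This is a toy network (one input, one sigmoid hidden unit, one sigmoid output, biases omitted), not the 1986 paper's setup:

```python
# Minimal pure-Python backpropagation on a 1-1-1 sigmoid network,
# fitting a single training pair by gradient descent.
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

x, target = 1.0, 0.8
w1, w2 = 0.5, 0.5
lr = 2.0

for _ in range(1000):
    # forward pass
    h = sigmoid(w1 * x)
    y = sigmoid(w2 * h)
    # backward pass: chain rule from the squared error E = (y - target)^2 / 2
    dE_dy = y - target
    dy_dz2 = y * (1 - y)              # sigmoid derivative at the output
    dE_dw2 = dE_dy * dy_dz2 * h
    dE_dh = dE_dy * dy_dz2 * w2       # error propagated back to the hidden unit
    dE_dw1 = dE_dh * h * (1 - h) * x
    # gradient-descent step
    w2 -= lr * dE_dw2
    w1 -= lr * dE_dw1

print(round(sigmoid(w2 * sigmoid(w1 * x)), 2))  # ≈ 0.8 after training
```

The same recipe, repeated across many layers and many training examples, is what the 1986 paper showed could train multilayer networks.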

The number of citations of the 1986 article

Google Scholar

In October and November 2017, Hinton published two open-access papers on capsule neural networks, which he says work well.

At the 2022 Neural Information Processing Systems conference, Hinton introduced a new learning algorithm for neural networks called the forward-forward algorithm. Its main idea is to replace the forward and backward passes of error backpropagation with two forward passes: one with positive (real) data and one with negative data that the network itself generates.

When the creator questions his creation

Finally, in May 2023, after about ten years with Google, Hinton resigned so that he could speak freely about the dangers of the commercial use of artificial intelligence. He was concerned about AI's power to generate fake content and its impact on the job market. Below is part of what Hinton said in a 2023 interview:

I think we’ve entered an era where, for the first time, we have things that are more talented than us. Artificial intelligence understands and has talent. This advanced system has its own experiences and can make decisions based on those experiences. Currently, artificial intelligence does not have self-awareness, but over time, it will acquire this feature. There will even come a time when humans are the second most talented creatures on earth. Artificial intelligence came to fruition after many disappointments and failures.

– Geoffrey Hinton

The supervisor of the doctoral course asked me to work on another subject and not to jeopardize my future work, but I preferred to learn about the functioning of the human brain and mind and simulate it, even if I fail. It took longer than I expected, about 50 years, to achieve the result.

At one point, the reporter asked Hinton when he had concluded that his idea about neural networks was right and everyone else was wrong. "I've always thought I was right, and I'm right," Hinton replied after a pause, with a smile.

With the advent of ultra-fast chips and the vast amounts of data generated on the Internet, Hinton's algorithms attained almost magical power. Little by little, computers learned to recognize the content of photos, and later to recognize speech and translate from one language to another. In 2012, terms like neural networks and machine learning made the front page of the New York Times.

Read more: The biography of Ada Lovelace; the first programmer in history

From Turing to Nobel: The Unparalleled Honors of the Godfather of Artificial Intelligence

As one of the pioneers of artificial intelligence, Geoffrey Hinton has been recognized many times for his outstanding achievements. He has received numerous awards, including the Cognitive Science Society's David E. Rumelhart Prize and Canada's Gerhard Herzberg Gold Medal, the country's highest honor in science and engineering.

One of Hinton's most notable honors was winning the 2018 Turing Award together with his colleagues. The award is so prestigious in computing that it is often called the Nobel Prize of Computing; it recognized Hinton's sustained contributions to the development of neural networks. In 2022, Hinton added another honor, receiving the Royal Society's Royal Medal for his pioneering work in deep learning.

2024 was a historic year for Geoffrey Hinton: he and John Hopfield won the Nobel Prize in Physics for their achievements in machine learning and artificial neural networks. The Nobel Committee awarded the prize for fundamental discoveries and inventions that made machine learning with artificial neural networks possible, and specifically mentioned the development of the Boltzmann machine.

When a New York Times reporter asked Hinton to explain in simple terms the importance of the Boltzmann machine and its role in pretraining backpropagation networks, Hinton jokingly referred to a quote from Richard Feynman:

Look, my friend, if I could explain this in a few minutes, it wouldn’t be worth a Nobel Prize.

– Richard Feynman

This humorous response shows how complex the technology is: fully understanding it requires extensive knowledge and study. The Boltzmann machine is one of the earliest neural network models (1985); as a statistical model, it helps a network automatically find patterns in data.

Geoffrey Hinton is a man who turned the dream of machine intelligence into a reality by standing against the currents. From back pain to receiving the Nobel Prize in Physics, his life path was always full of ups and downs. With steely determination and perseverance, Hinton not only became one of the most influential scientists of the 20th century but also changed the world of technology forever with the invention of artificial neural networks. His life story is an inspiration to all who pursue their dreams, even when the whole world is against them.


Elon Musk's robotaxis
Elon Musk brought the idea of smart public transportation one step closer to reality by unveiling the Cybercab and the Robovan.

Everything about the Cybercab and Robovan; Elon Musk's robotaxis

After years of passionate but unfulfilled promises, on October 11, 2024, at the "We, Robot" event, Elon Musk finally unveiled Tesla's robotaxis.

Appearing on stage an hour late, Musk showed off the Cybercab self-driving taxi: a silver two-seater that moves without a steering wheel or pedals.

The CEO of Tesla also announced the presence of 21 Cybercabs and a total of 50 self-driving cars at the Warner Bros. studio in California, where Tesla hosted the invitation-only event.

Tesla Cybercab robotaxi in profile

Tesla

"We're going to have a very glorious future ahead of us," Musk said, but gave no indication of where the new cars will be built. According to him, Tesla hopes to offer the Cybercab to consumers at a price below $30,000 before 2027.

The company will reportedly begin testing “unsupervised FSD” with Model 3 and Model Y electric vehicles in Texas and California next year.

Currently, the company's self-driving cars operate with supervised FSD, meaning a human must be ready to take control of the steering wheel or brakes at any time. Tesla needs several permits from regulators in various US states (and other countries) to offer cars without a steering wheel and pedals.

But the Cybercab was not the only product unveiled at the event. Alongside the line-up of Optimus robots, likely to launch as consumer work assistants in the coming months, the unveiling of an autonomous robotic van that can carry up to 20 passengers or be used for cargo generated even more excitement among the audience.

Tesla Robovan side view

According to Musk, the Robovan and Cybercab use inductive charging and need no physical power connection to recharge. He also said the Robovan would solve the problem of high-density transport, pointing to moving sports teams as an example.

The CEO of Tesla has been drawing the dream vision of the company’s self-driving public transportation fleet for the shareholders for years and sees the company’s future in self-driving vehicles.

It is worth remembering that the "We, Robot" event was Tesla's first product unveiling since the Cybertruck's introduction in 2019; that product, which reached the market in late 2023, has since been recalled five times in the United States over various problems.

The event ended with Elon Musk’s “Let’s party” and a video of Optimus robots dancing, while Tesla’s CEO invited guests to take a test drive with the on-site self-driving cars inside the closed-off film studios.

However, experts and analysts of the self-driving car industry believe the release of Cybercabs will take longer than the announced schedule, because ensuring the safety of these cars in scenarios such as bad weather, complex intersections, and unpredictable pedestrian behavior will require many permits and tests.

Tesla shareholders still balk at Musk’s vague timetable for the production and delivery of new cars, as he has a poor track record of promising robotic taxis. But we cannot deny that this unveiling breathed new life into the world of self-driving technologies.

But where did the idea of ​​robotic taxis, which Tesla CEO claims are 10 to 20 times safer than human-driven cars and reduce the cost of public transportation, start?

Tesla Robovan next to Cybercube

Tesla

In 2019, during a meeting on the development of Tesla’s self-driving cars, Elon Musk suddenly made a strange prediction: “By the end of next year, we will have more than a million robot taxis on the road.”

Tesla’s investors were no strangers to the concept of fully autonomous, driverless cars; what surprised them was the timing and the short window Musk was announcing. The prediction did not come true by the end of 2020 and has been postponed many times since. In recent months, though, with Tesla's profits shrinking, Musk has tried in various ways to divert Wall Street's attention from the company's core business to a new focal point: at every opportunity, he explains that Tesla's future lies not in electric cars but in the exciting world of artificial intelligence and humanoid robots.

According to him, one of the most profitable AI businesses will be driverless taxis, or robotaxis, that work almost anywhere and in any conditions. Musk believes Tesla's market value will reach several trillion dollars after these cars launch, although they will also pull Tesla into a highly competitive market.

Tesla’s technology will face fierce competition from Alphabet’s Waymo, Amazon’s self-driving unit Zoox, and General Motors’ Cruise. Also, ride-sharing companies such as Uber and Lyft and Chinese companies such as Baidu and BYD are considered serious competitors of Tesla.

Can robotaxis really save Tesla from declining profitability? How close is the company really to the production of driverless and fully autonomous car technology, and what guarantee is there for the success of Elon Musk’s plans to form a vast network of robotic taxis?

The start of the internal project of Tesla’s self-driving taxis

Elon Musk at the presentation ceremony of Tesla's autopilot system

Business Insider

Although Elon Musk had floated the idea of robotaxis since 2016, design and development work began in earnest in 2022. During Tesla's first-quarter earnings call that year, Musk once again announced that the company was building robotic taxis with no steering wheel, pedals, or any other controls for physical human driving.

He also said these cars would be fully self-driving and available to the public by 2024, once Tesla completed its self-driving program. Some time later, at the opening ceremony of the Gigafactory in Austin, he mentioned that the robotaxis would have a futuristic design, probably looking more like a Cybertruck than a Tesla Model S.

Tesla’s robotic taxis have no steering wheel, pedals, or any other controls for physical human driving

During the same call, a Tesla investor asked Musk whether the robotaxis would be operated as a fleet service or sold directly to consumers. Musk did not answer, but kept emphasizing that robotaxis minimize a car's cost per kilometer of distance, and that traveling in one would cost less than a bus or subway ticket.

Shortly before Musk's remarks, Tesla had announced it was producing fully autonomous, self-driving vehicles at a cost of $25,000, with or without a steering wheel. For this reason, no one yet knows whether Musk's robotaxi project refers to these cars or not.

According to the announced timeline, Tesla had 32 months to complete the construction, legal permits, and software required for the robot taxis and align with acceptable standards for “level 5 autonomy.”

At the beginning of 2024, the subject of robotic taxis made the news again. Elon Musk, who seemed fed up with Tesla’s usual car business, emphasized that Tesla’s future does not depend on selling more electric cars, but mainly on artificial intelligence and robotics.

Unlike Uber, Tesla's main competitor in this space, Musk does not want to rely on Model 3 sedans or SUVs like the Model Y to build out robotaxis. According to Tesla, the company is instead working on a new dedicated vehicle, likely to be called the Cybercab.

The launch of robotaxis depended on Tesla completing its Autopilot and so-called Full Self-Driving systems, and there were no firm figures on how readily consumers would accept the product or what new rules the field would face.

Car design

Tesla Robotaxis concept design

Teslaoracle

In terms of design, the interior was expected to differ from other Tesla electric cars to meet passengers' needs: for example, two rows of seats facing each other, or sliding doors that make boarding easier. A car used as a taxi also needs provisions for quick cleaning and even disinfection of the interior.

The robotaxi idea also drew interesting design proposals from enthusiasts. Some said Tesla should optimize its public self-driving cars for different uses: some could have a place to rest on long trips, for example, while others could come with a monitor and accessories suited to working on the way.

Supporters argued that such amenities improve quality of life: a passenger who spends travel time on something useful has effectively saved that time.

Continuing the speculation about the Cybercab's design, a group of automotive researchers suggested that in the coming years Tesla could produce other vehicles tailored to entertainment, such as watching movies, or to socializing with friends and fellow travelers along the way: much like riding in a limousine.

The Cybercab's design is similar to the Cybertruck, but with doors that open from the top

But the initial design of the Cybercab published on Tesla's website was somewhat reminiscent of the Cybertruck, with no special accommodations even for passengers with disabilities.

In its latest report comparing companies' self-driving cars, Forbes wrote that Tesla's robotaxi would probably be a two-seater with side-by-side seats and a retractable steering wheel, since users would eventually need a steering wheel to drive outside the areas covered by the company's support services.

Tesla Cybercab back and side view with open doors

However, the final design of the Tesla Cybercab did not resemble the self-driving cars of the startups Zoox or Zeekr.

With doors that open up like butterfly wings and a small interior, the car hosts only two passengers. As might have been guessed, the Cybercab looks a lot like the Cybertruck, but it is sleeker and more eye-catching than Tesla's controversial pickup.

Hardware

Tesla Cybercab robotaxi

Sugar-Design

So far, Tesla has not disclosed any information about the sensor suite the robotaxis will use. The company describes its Autopilot technologies on its website, but what Musk has so far described, a fully self-driving car with no driver at all, will require more advanced sensors, software, and equipment than Autopilot.

Tesla Autopilot cars are equipped with multiple layers of cameras and powerful “machine vision” processing, and instead of radar, they use special “Tesla Vision” technology that provides a comprehensive view of the surrounding environment.

In the next step, Tesla Autopilot processes the data from these cameras using neural networks and advanced algorithms, then detects and groups objects and obstacles and determines their distance and relative position.

Tesla’s Autopilot system is equipped with multiple layers of cameras and powerful “machine vision” processing and uses “Tesla Vision” instead of radar.

The driving functions also include two key features: 1. traffic-aware adaptive cruise control, which adjusts the car's speed to the surrounding traffic; 2. the Autosteer steering system, which keeps the car in its lane alongside cruise control and holds the correct path, especially through curves in the road.

These cars can park automatically, recognize stop signs and other road signs as well as traffic lights, and slow down if necessary. Blind spot monitoring, automatic switching between road lanes, and intelligent summoning of the car by mobile application are some other features of these cars.
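To make the division of labor concrete, here is a toy sketch of how a traffic-aware cruise controller might pick a speed from the objects a vision stack reports. This is in no way Tesla's actual software; every name and number below is an illustrative assumption:

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    kind: str          # e.g. "car", "pedestrian", "stop_sign", "traffic_light_red"
    distance_m: float  # estimated distance ahead of the ego vehicle

def plan_speed(current_speed: float, objects: list[DetectedObject],
               cruise_speed: float = 30.0, safe_gap_m: float = 25.0) -> float:
    """Toy traffic-aware cruise control: stop for signs and red lights,
    slow for nearby obstacles, otherwise converge on the cruise speed."""
    target = cruise_speed
    for obj in objects:
        if obj.kind in ("stop_sign", "traffic_light_red") and obj.distance_m < 40:
            return 0.0  # command a full stop
        if obj.kind in ("car", "pedestrian") and obj.distance_m < safe_gap_m:
            # scale the target speed down with the remaining gap
            target = min(target, cruise_speed * obj.distance_m / safe_gap_m)
    # ease toward the target instead of jumping to it
    return current_speed + 0.5 * (target - current_speed)

# One tick of the loop: a car detected 10 m ahead forces the ego car to slow.
speed = plan_speed(current_speed=30.0, objects=[DetectedObject("car", 10.0)])
```

A real system runs a loop like this many times per second, fed by the camera-based detection and distance estimation described above, with far more sophisticated planning and control in between.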

Despite all these safety measures, every Tesla Autopilot car still requires driver supervision, both under national law and by the company's own admission. Until Tesla releases new specifications for the robotaxis' sensors, cameras, and systems, no expert can assess their efficiency or risk.

Introducing the Robotaxis application

The image of the map on the Tesla Robotaxis application

Tesla

In April 2024, Tesla released a brief report on the mobile application of robotaxis, and Elon Musk also said that the first of these cars would be unveiled in August (this date was later postponed).

The first images of the robotaxi app showed a button to summon a taxi and, just below it, a message with the car's estimated arrival time. A second image showed a 3D map and a small virtual vehicle following a path toward a waiting passenger. The images looked much like the Uber app, except that the car driving in them appeared to be a Tesla Model Y.

According to Tesla, passengers can adjust the temperature of the car as they wish when they are waiting for the taxi to arrive. Of course, other details such as the waiting time and the maximum passenger capacity of the car were also seen in the images of the application.

Passengers can adjust the temperature inside the car and their favorite music through the Tesla application

According to the published screenshots, once the vehicle reaches the pickup point and the passenger boards, the map view switches to the destination. Passengers can also control the car's music through the mobile application.
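The flow the screenshots describe — hail a car, pre-condition the cabin while waiting, board, then control the music en route — can be sketched as a small state machine. Every class and method name below is an illustrative assumption, not Tesla's real app API:

```python
class RobotaxiRide:
    """Toy model of the ride stages visible in the app screenshots."""
    STATES = ("requested", "en_route_to_pickup", "in_ride", "completed")

    def __init__(self):
        self.state = "requested"
        self.cabin_temp_c = 21.0  # default cabin temperature
        self.track = None

    def set_temperature(self, temp_c: float):
        # The screenshots show waiting passengers pre-conditioning the cabin.
        self.cabin_temp_c = temp_c

    def advance(self):
        # Move to the next stage of the trip (stays at "completed").
        i = self.STATES.index(self.state)
        self.state = self.STATES[min(i + 1, len(self.STATES) - 1)]

    def play_music(self, track: str):
        # Music controls make sense only once the passenger is aboard.
        if self.state != "in_ride":
            raise RuntimeError("music controls unlock once the ride starts")
        self.track = track

ride = RobotaxiRide()
ride.set_temperature(19.5)  # while waiting for pickup
ride.advance()              # car is en route to the passenger
ride.advance()              # passenger boards; map flips to the destination
ride.play_music("road-trip mix")
```

The point of modeling it as explicit states is that each control (temperature, music) is only valid in certain stages, which matches what the screenshots show.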

The app looks like a standard ride-hailing app, but nothing in it points to the robotic nature of the car, which does all the driving autonomously. Elon Musk said at the same event:

You can think of Tesla’s robotaxis as a combination of Uber and Airbnb.

According to Musk, part of the robotaxi fleet will belong to Tesla and the rest to consumers, who can add their cars to the taxi fleet whenever they want and earn money that way.

Legal restrictions on removing the steering wheel and pedals

Tesla robot taxi without a steering wheel

independent

Despite all his previous promises, Tesla's CEO has been evasive when asked in past interviews whether the robotaxis will have traditional controls like pedals and steering wheels. Delays in early prototype development have cast heavy doubt on Tesla's robotaxi plans, making the answer to that question more important than ever.

The reality is that by mid-2024, in theory, it could take months or even years to approve a vehicle without pedals and a steering wheel for public roads, while a more traditional-looking vehicle could come much sooner.

In a letter addressed to its shareholders, Tesla emphasized that it would need the permission of the federal government to deploy and operate robotaxis with a more radical and progressive design. The statement also stated:

Scaling robotaxis requires technological advances and regulatory approvals, but given their very high potential value, we intend to make the most of this opportunity and are working hard on the project.

Elon Musk also did not respond to a question about exactly what type of regulatory approval Tesla is seeking.

He was then asked by reporters if Tesla was seeking an exemption from the Federal Motor Vehicle Safety Standards (FMVSS) to develop and market a car without traditional controls. In response, Musk compared Tesla’s new product to Waymo’s local self-driving cars and said that products that are developed for local transportation are very vulnerable and weak. He added:

The car we produce is a universal product that works anywhere. Our robotaxis work well on any terrain.

Currently, car manufacturers must comply with Federal Motor Vehicle Safety Standards that require human controls such as steering wheels, pedals, and side mirrors. These standards specify how vehicles must be designed before they can be sold in the United States; a manufacturer whose new product does not meet them can apply for an exemption, but the US government caps exemptions at 2,500 cars per company per year.

The exemption cap would theoretically prevent any AV company, including Tesla, from mass-deploying purpose-built self-driving vehicles. Self-driving advocates have long pushed for legislation to raise the cap on driverless cars allowed on public roads, but the bill has apparently stalled in Congress over questions about the technology's reliability and readiness.
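A quick back-of-the-envelope calculation shows why the 2,500-vehicle cap matters for any mass-deployment plan. The fleet figures below are the ones quoted elsewhere in this article; the rest is plain arithmetic:

```python
# How long would fielding a large fleet of vehicles without traditional
# controls take under the current FMVSS exemption cap, absent a change
# in the law? (Fleet sizes are the figures quoted in this article.)
EXEMPTION_CAP_PER_YEAR = 2_500  # per manufacturer, per year

fleet_targets = {
    "1M robotaxis (2019 promise)": 1_000_000,
    "7M-car fleet target": 7_000_000,
}
years_needed = {name: n / EXEMPTION_CAP_PER_YEAR
                for name, n in fleet_targets.items()}
# -> 400 years for 1 million cars, 2,800 years for 7 million
```

That gap between the cap and the stated ambitions is why an FMVSS exemption alone cannot carry a mass rollout.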

Tesla will need an FMVSS exemption if it wants to remove the steering wheel and pedals from its self-driving cars

So far, only Nuro has managed to obtain an FMVSS exemption, allowing it to operate a limited number of driverless delivery robots in the states of Texas and California.

For example, General Motors’ Cruise unit applied for a waiver for Origin’s steering-less and pedal-less shuttle, but it was never approved, and the Origin program was put on hold indefinitely.

Tesla Cybercab interior view and seats

Startup Zoox (a subsidiary of Amazon) also declared its self-driving shuttles "self-certified," prompting the US National Highway Traffic Safety Administration to open an inquiry into this newly invented concept. Strict legal processes and licensing hurdles have led other companies in the field to sidestep removing the steering wheel and pedals altogether. Waymo's self-driving cars, for example, operate on public roads without a safety driver but keep traditional controls. The company recently said it would eventually introduce a new driverless car, but gave no date and made no mention of FMVSS exemptions.

Thus, now that it is clear the final Cybercab will be produced without traditional controls, Tesla must clear similar regulatory hurdles.

The challenges of mass production of Tesla robotaxis

Tesla's robot taxi design

Sugar-Design

Apart from persuading regulators and obtaining city traffic permits, many other challenges have stood in the way of the robotaxi project; Tesla has overcome some and still has no answer for others.

For example, Tesla claims it has reached a reliable level of technology and hardware infrastructure, but incidents such as the 2018 crash of an Uber self-driving car that killed a pedestrian, or two severe Cruise crashes in 2023, have left the public with a negative view of driverless cars.

On the other hand, the current infrastructure of most American cities is designed for conventional cars and must be upgraded to support a multitude of robotaxis. Installing smart traffic lights that communicate with self-driving cars and feed them real-time information is one basic need; clear lane markings and legible traffic signs are also critical for self-driving sensors.

The mass production of robotaxis requires changing the road infrastructure

Contrary to Musk's claim that "the roads are ready for permanent robot taxis," other companies' self-driving cars still operate only in limited parts of the United States. As of July 2024, Tesla had about 2.2 million cars on American roads, far from Musk's target of a 7-million-car fleet.

Second, Tesla's self-driving cars rely on advanced cameras, sensors, and data-processing systems, which raise not only production costs but also the cost of maintaining the equipment and keeping the software up to date.

In the past year alone, some Tesla customers have been forced to pay an extra $12,000 to upgrade their cars’ self-driving capabilities, while there’s still no news of new features.

If the average robotaxi price lands between $150,000 and $175,000, it is unclear how long it would take for Musk's promise of revenue-generating potential to pay off for buyers. His prediction of $30,000 in annual gross profit for owners who lend their cars to other passengers has no statistical or computational backing.
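The arithmetic behind that skepticism is simple. Taking the quoted price range and the claimed $30,000 annual gross profit at face value (both figures are the article's, not verified), the payback period alone looks like this:

```python
# Sanity-checking the owner-economics claim with simple arithmetic.
def payback_years(price: float, annual_gross_profit: float = 30_000) -> float:
    """Years of claimed gross profit needed just to recover the purchase price."""
    return price / annual_gross_profit

low, high = payback_years(150_000), payback_years(175_000)
# -> 5.0 to ~5.8 years before break-even, and that ignores maintenance,
#    insurance, software fees, and depreciation entirely
```

Any downward revision of the profit claim, or any recurring cost, stretches that horizon further.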

Developing new insurance models for self-driving cars will be one of Tesla’s serious challenges

Developing suitable insurance models for self-driving cars will also be one of Tesla's serious challenges, because insurers must be able to correctly assess robotaxis' risks and likely costs; Tesla will therefore have to work with insurance companies from several angles to reach a comprehensive plan that satisfies both customers and insurers.

Beyond the technological and legal issues, Tesla must earn people's trust in its new series of fully automatic, driverless cars, which will require advertising campaigns and extensive training programs to familiarize consumers with the company's technologies and ease end users' concerns.

The status of the project in 2024 and the concern of shareholders

Tesla Cybercab in the city

Tesla

In 2024, Elon Musk postponed the robotaxi unveiling first to August 8 and then to October 10. In April, he told Tesla investors frustrated by the cars' uncertain production progress:

All the cars that Tesla produces have all the necessary hardware and computing for fully autonomous driving. I’ll say it again: all Tesla cars currently in production have all the prerequisites for autonomous driving. All you have to do is improve the software.

He also said it does not matter if these cars are less safe than newer ones, because Tesla is raising the average level of road safety. A few weeks later, he released another video summarizing Tesla's goals in three steps:

  • Completing the technological features and capabilities of fully autonomous vehicles
  • Improving car technology to the point where people can ride driverless cars without any worries
  • Convincing regulators that the previous point is true

While other self-driving car companies go from city to city obtaining permits and expanding their operating areas by proving the safety of their products, NBC recently revealed that Tesla has not even obtained a license to test these cars in California and Nevada, the states where it employs the most people.

Tesla has not yet received permission to test robotaxis in the US states

In July, Musk told investors that anyone who does not believe in the efficiency and value of robotaxis should not be a Tesla investor. Meanwhile, the California Department of Motor Vehicles recently filed a lawsuit against Tesla, accusing the company of falsely advertising Autopilot and fully automated systems.

In addition to specifying the monthly cost and upfront payments for fully autonomous cars, the case also addresses the issue that both types of systems require drivers to be behind the wheel and control the vehicle’s steering and braking.

The unveiling of the Cybercab in October 2024 seemed to ease Tesla shareholders' minds somewhat, but on the night of the company's big event some of them voiced concern to the media about Musk's uncertain timelines.

What do experts and critics say?

Some critics say Elon Musk's robotaxi cannot be produced and released on his schedule. Pointing to Waymo vehicles that make 50,000 road trips every week, they consider Tesla's silence in response to requests for technical details about its vehicles unacceptable. In their view, Musk is simply continuing to make vague promises about the car.

In response to critics, Elon Musk keeps repeating one line: Tesla is fundamentally an artificial intelligence and robotics company, not a traditional automaker. So why does he not try to clarify the obstacles standing in the way of realizing the old idea?

Academic researchers, meanwhile, note that Tesla's systems have not reached Level 5 autonomy, that is, operation requiring no human control at all. The MIT Technology Review writes:

After years of research and testing of robotic taxis by various companies on the road, mass production of these cars still has heavy risks. To date, these vehicles only travel within precise, pre-defined geographic boundaries, and although some do not have a human operator in the front seat, they still require remote operators to take over in an emergency.

Ram Vasudevan, associate professor of robotics and mechanical engineering at the University of Michigan, also says:

These systems still rely on remote human supervision for safe operation, which is why we call them automated rather than autonomous; But this version of self-driving is much more expensive than traditional taxis.

Tesla is burning investors' money directly to build robotaxis, so it is natural that pressure on the company for more reliable results will grow. Costs and potential revenues will only balance once more robotaxis hit the roads and can truly compete with ride-hailing services like Uber.

Despite numerous legal, infrastructural, and social challenges, the unveiling of the Cybercab and Robovan puts not just self-driving technology but the entire transportation industry on the threshold of a huge transformation. Tesla's robotaxis could even upend the traditional taxi service model, but how long will the transition from unveiling to actual launch take?
