Does the saying "physical access = game over" apply to smartphones, too?
"Physical access = game over" is an over-simplification. It ultimately boils down to the outcome of a threat assessment: what the vendor needs to protect, and to what level. The direct answer to your question is a great big 'it depends'.
Smartphones are no different from other devices in that they are computers running an operating system of some description, handling data of some kind, and interacting with other computers or people in some way via peripherals.
The class and range of attacks a device is susceptible to when physical access is possible is very different from the type of attacks it would be susceptible to over a network. Conversely, the impact on the ecosystem is also quite different, and it could be as severe or worse on the network side.
Modern operating systems and smartphone hardware employ multiple mechanisms that aim to secure user data from attackers, whether by means of physical attacks or otherwise. Even "physical attacks" can vary, from occasional access (a few minutes of casual access) to unlimited time and micro-electronics expertise in a lab (as in forensic investigations). But some (or all) of these features can be defeated by things like the local configuration of the device: weak passwords, guessable (or absent) PIN codes, and so on. Online backup services and cloud-based accounts (Apple/Google) widen these vectors, since most of the data on a device ends up mirrored in the cloud in some way.
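To put a rough number on why a guessable PIN undermines everything else, here is a back-of-the-envelope sketch. The attempt rates are illustrative assumptions, not measurements of any particular device; real phones add escalating delays and wipe-after-N-failures policies on top of this.

```python
# Rough arithmetic: worst-case time for an attacker with physical access
# to exhaust a PIN keyspace. Attempt rates below are assumptions for
# illustration only, not measured values for any real device.

def brute_force_time_seconds(keyspace: int, attempts_per_second: float) -> float:
    """Worst-case time to try every candidate at a fixed attempt rate."""
    return keyspace / attempts_per_second

# A 4-digit PIN has only 10^4 candidates.
pin_keyspace = 10 ** 4

# With OS-enforced throttling (assume 1 attempt per 5 seconds on average):
throttled = brute_force_time_seconds(pin_keyspace, 1 / 5)

# If the attacker bypasses throttling (e.g. attacks an extracted storage
# image directly), even a modest 1000 attempts/second is devastating:
unthrottled = brute_force_time_seconds(pin_keyspace, 1000)

print(f"4-digit PIN, throttled:   ~{throttled / 3600:.1f} hours")   # ~13.9 hours
print(f"4-digit PIN, unthrottled: ~{unthrottled:.0f} seconds")      # ~10 seconds
```

This is why modern phones try hard to force every guess through throttled, hardware-backed code paths: the keyspace of a short PIN is trivial on its own, so the rate limit is doing all the work.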
However, not all smartphone hardware is created equal, and not all operating systems in the field are implemented to the same security strength, so there are attack vectors through which full access is possible against certain hardware/software combinations, provided physical access is available.
This is a very short summary; there is enough scope in this matter to fill books.
As a general concept in information security, physical access is a rather severe attack vector.
Standard x86 PCs and servers are particularly vulnerable because they have few or no mitigations against physical threats. Some things like disk encryption can help, but physical security is just not a significant design feature.
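To illustrate the limits of that mitigation: full-disk encryption schemes typically derive the disk key from a passphrase via a slow key-derivation function, so an attacker who images the disk can guess passphrases offline at whatever rate their hardware allows. This sketch uses PBKDF2 from Python's standard library; the iteration count and the sample passphrases are illustrative assumptions, not any real system's parameters.

```python
import hashlib
import os

# Hedged sketch: disk encryption is only as strong as the passphrase
# behind it. A slow KDF (here PBKDF2) raises the cost of each guess,
# but a weak passphrase still falls to a short offline wordlist.

ITERATIONS = 200_000  # slows down each guess; real systems tune this

def derive_key(passphrase: str, salt: bytes) -> bytes:
    """Derive a 256-bit key from a passphrase (PBKDF2-HMAC-SHA256)."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, ITERATIONS)

salt = os.urandom(16)
stored_key = derive_key("letmein", salt)  # weak, commonly chosen passphrase

# Offline attack: the attacker has copied the salt and can verify
# guesses against the encrypted disk without any OS throttling.
for guess in ["password", "123456", "letmein", "qwerty"]:
    if derive_key(guess, salt) == stored_key:
        print(f"Recovered passphrase: {guess!r}")
        break
```

The point is that encryption at rest shifts the problem from "protect the hardware" to "protect the passphrase", which is why the smartphone designs described below bind the key to tamper-resistant hardware rather than to the passphrase alone.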
Smartphones treat the threat more seriously; being lost or stolen is a much more present hazard. Unlike commodity PC hardware, end users do not expect to be able to install hardware or run arbitrary software, which allows tighter, more tamper-resistant casing and proprietary hardware with strict security features. Added snooping hardware would need to be quite small, and breaking into the device requires real reverse engineering or even an exploit. These are only countermeasures, though: smartphones are still vulnerable to several physical attacks; they just tend to require more time, skill, and dedication. It is part of a defense-in-depth strategy, where the first line of defense is physical security.
TL;DR: the answer is yes, given enough (unrestricted) physical access, skills, motivation, and resources.
Those laws are often very general laws that express general concepts in information security. In this case, the law says that the attacker generally needs unrestricted physical access. And when they say it's not your computer anymore, it means that it's very, very hard, or even impossible, to defend against such attacks, depending on how unrestricted the access is. So the key here is the word unrestricted. And, as usual, it also depends on the skills and resources of the attacker.
So, if the attacker is free to physically access your smartphone and has enough skills, motivation, and resources, is it "game over", meaning that "it's not your smartphone anymore", meaning that it is going to be extremely hard, even impossible, to prevent the attack? The answer is yes. All it would take is an advanced evil maid attack.
The attacker can check out your phone: what it looks like, what is needed to unlock it. Given enough physical access to your environment, the attacker might even be able to collect some information about you (how you use the phone, what apps you have, what settings are enabled, etc.), either directly (the attacker lives with you) or indirectly (the attacker has installed hidden cameras). Then they can steal your phone and replace it with an identical copy running special software, such that as soon as you unlock the copy the authentication info is sent to them. Your fingerprint? A representation of it can be sent as well. You have "find my phone and erase the data" enabled? It won't work, because the attacker is now working in a shielded basement, and your phone has no signal whatsoever.
As a thought experiment, in theory you could even devise a method where the copy of the device runs a modified OS that syncs with the stolen phone. It might be very slow at the beginning, while it syncs for the first time, installing all the apps and importing all the settings, and this initial process might be hidden behind a fake OS update. But in the end you would have a fully functional copy of your phone, running on a modified OS controlled by the attacker. Then Business Process Compromise would become possible. Of course this isn't an attack you can expect from your girlfriend or your grandma. But if your threat model involves government agencies, then who knows what they are capable of. Hell, for all we know, the NSA might already have these handy phone copies. Or should I patent this invention? LoL