A few suggestions for Tesla:

1. Personal driving data is private, and users have the right to see their own data. Tesla could open up this layer through the app: after every drive cycle, the owner can watch the driving data that was uploaded to Tesla, in real time. That eliminates any question of tampering or selective disclosure, and the innocent no longer have to prove their innocence (see the sketch after this list).

2. Installing more cameras in the car is a separate matter. One option is to move the in-cabin camera that monitors the driver's state to a spot above the driver's head, or to the armrest console, so that it records how the driver actually operates the car. The reasoning: nobody wants Tesla filming what they do inside the car, but right now everybody wants a way to prove their innocence, so a camera aimed at the driving process is a trade many would accept.

3. Stop marketing "autonomous driving" and switch to "driver assistance; the driver bears the corresponding responsibility," so that individual users do not misjudge what the system can actually do. That avoids an L2 system being pushed by users into L5 scenarios, failing there, and dragging the product's reputation down with it.

4. The development process does not have to follow AUTOSAR or ISO 26262, but the whole vehicle development process should be re-examined: treat software development more cautiously, release features more cautiously, verify more cautiously, instead of chasing "new" for its own sake, and value users' lives more. A broken phone can be replaced; a person has only one life.

5. Cancel FSD as a paid option. Push the full FSD feature set to everyone who has already bought a Tesla, with a prompt along the lines of: "FSD (Full Self-Driving) is a vision, not true fully automatic driving. These functions only help improve the driving experience; please do not over-rely on them." For future buyers, let anyone who does not want FSD order the car without the FSD hardware, so users are no longer forced to pay for hardware behind functions they neither need nor want. As I understand it, the software and hardware are bundled: if I do not need FSD, I need neither the software nor the hardware, and making me pay for the hardware anyway is a forced sale. In short, improve the experience for existing owners and give future buyers a real choice.
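A minimal sketch of suggestion 1, under my own assumptions (the record format, field names, and digest scheme are hypothetical illustrations, not Tesla's actual API): the vehicle publishes each drive cycle's raw signal log to the owner's app along with a digest, so neither side can later claim the record was altered.

```python
import hashlib
import json
from dataclasses import dataclass, asdict

@dataclass
class DriveCycleRecord:
    vin: str
    started_at: str            # ISO 8601 timestamps
    ended_at: str
    signals: dict              # e.g. {"speed_kph": [...], "brake_pct": [...]}

def seal(record: DriveCycleRecord) -> dict:
    """Serialize the record and attach a SHA-256 digest.

    The same payload goes to Tesla's backend and to the owner's app;
    matching digests later prove neither copy was modified."""
    payload = json.dumps(asdict(record), sort_keys=True).encode()
    return {"record": asdict(record),
            "sha256": hashlib.sha256(payload).hexdigest()}

if __name__ == "__main__":
    rec = DriveCycleRecord(
        vin="5YJ3E1EA7KF000000",
        started_at="2021-04-20T08:00:00Z",
        ended_at="2021-04-20T08:25:00Z",
        signals={"speed_kph": [0, 32, 57], "brake_pct": [0, 0, 40]},
    )
    print(seal(rec)["sha256"])
```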

By zhiwo

helpmekim
5 months ago

When I read the victim's statement, it felt like hearing a driver report a malfunction during one of my own road tests. There are a few key points in it: 1. before the accident the driver, not the system, was driving the car, and could change lanes normally; 2. the steering wheel control was suddenly taken over by the system and the driver could not intervene; 3. the driver said the system failed to recognize the isolation barrier.

On the first point, some will have questions, but it just says the driver was driving normally, listening to music, with nothing unusual. The problem appears at the second point. This amounts to an unexpected vehicle control action: an abnormal system function, a piece of logic that should not exist, and one the system could not be forced out of by human intervention. My first thought was whether the control logic contains some rule under which the system automatically intervenes in an emergency. But even with such a rule this should not happen; at minimum the strategy is badly judged, because giving the system higher priority than the human is simply unscientific. If the data exists, you can pull the AP working-status signal at the time, together with the steering torque and steering angle signals and the accelerator and brake pedal signals, and analyze whether AP was unexpectedly engaged. You can also read the AP and steering system logs for historical faults; a steering system fault could feel exactly like this to the driver.

The third point is the driver's claim that the system could not recognize the barrier. Here I lean toward suspecting the driver's judgment, because the probability of both the camera and the radar failing to perceive the barrier is fairly low. Under normal working practice I would certainly question what the driver said, and if possible test the car many times under the same or similar conditions to see whether the recognition failure reproduces.

Finally, if the three things they describe are all true, I would file this as a Class S fault: report immediately, suspend testing, because the system seized the human's right to operate the vehicle. That kind of problem can turn out large or small; it may well end in a crash, and I would not want to say goodbye to my test driver or to my own performance review. Tesla's present situation is genuinely difficult: both sides hold their own opinions, and neither can convince the other. In the future, Tesla's cars may really have to work the way our proving grounds do: first, in-cabin monitoring; second, field monitoring; third, data acquisition equipment collecting the on-board bus data and the private CAN data, so it can be judged whether the driver is lying and whether the car has a problem.
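As a rough illustration of that signal check, here is a minimal sketch; the signal names, units, and the takeover-torque threshold are my assumptions, not Tesla's real log schema:

```python
from dataclasses import dataclass

@dataclass
class Sample:
    t: float              # seconds into the log
    ap_engaged: bool      # AP working-status signal
    steer_torque: float   # driver torque on the wheel, Nm (signed)
    steer_angle: float    # degrees
    brake_pct: float      # brake pedal position, 0-100

OVERRIDE_TORQUE_NM = 3.0  # assumed takeover threshold; varies by vehicle

def suspicious_moments(log: list[Sample]) -> list[float]:
    """Timestamps where AP is engaged while the driver applies torque
    above the takeover threshold or is braking hard. Either input should
    normally disengage AP, so these points merit a closer look."""
    return [s.t for s in log
            if s.ap_engaged
            and (abs(s.steer_torque) > OVERRIDE_TORQUE_NM or s.brake_pct > 20)]
```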

heloword
5 months ago

It's Tesla and Autopilot again. Autopilot is the future trend, and I have always been optimistic about its development. But true autonomous driving belongs to the future, and maybe not to our generation; if you insist on forcing it into use today, you can only watch the tragedies happen one by one. Of course a new trend appears and everyone is eager to try it; that is normal. But the premise is that you have to be professional: you need to anticipate what the autonomous system intends to do and stay focused, ready to take over at any moment. Many people have watched car reviewers' videos and found them entertaining; unfortunately the videos never end with the words "do not imitate." On top of that, some companies aggressively promote their "autonomous driving," which leads consumers on, and then take no responsibility when those systems cause accidents. Car reviewers are professionals. They can predict what the car will do next, they know how hard they can pull the steering wheel back, and they know the most effective way to hit the brake pedal. I am sure you have never seen an autopilot test video where the driver abandons the steering wheel and keeps his hands off it entirely, right? This is why testers say autonomous driving is more tiring than driving yourself. If ordinary drivers do not have that professional skill and cautious attitude, they had best not use automatic driving frequently. Of course many enthusiasts will still give it a try, and that too is a driving force of industry progress. When autonomous driving finally succeeds, we will need to thank the testers who risk their lives every day, and also all the enthusiasts of cutting-edge technology. Many say autonomous driving will be a revolution in the automotive industry. It reminds me of Tan Sitong's line that no country's reform has ever succeeded without bloodshed: autonomous driving, too, will demand sacrifice, although that cannot be compared with our revolutionary martyrs.

helpyme
5 months ago

What the driver said still needs to be confirmed; people do sometimes lie, and more evidence is needed. But the underlying problem is real. When I answered the brake-failure question, I said the driver must hold the highest control authority. Tesla's autonomous driving does not use lidar, and that carries real safety risk. In the first Autopilot fatality, the white truck fooled the visual recognition and the height of its trailer fooled the millimeter-wave radar; the driver was killed on impact. It is possible the system failed to identify the obstacle this time too: in the eyes of Tesla's camera, might the concrete wall on the right, under that light, be recognized as a widened stretch of road? Lidar would detect it; what can the millimeter-wave radar on a Tesla actually detect? Lidar will get cheaper, and with more sensors and more redundancy this class of problem can be avoided.

But the problem of ultimate control is the serious one. All cars today are drive-by-wire; the person no longer directly controls the throttle, brakes, and steering the way old, mechanically linked cars did. Once automatic driving is introduced, the trip computer holds the control rights: the driver's behavior is merely one input signal, the sensors are input signals, and in the future even a network signal for remote control will be an input signal. Who has the highest priority? The computer has the final say. Today it is a bug; tomorrow a hacker a thousand miles away commands your car to accelerate to 250 km/h into a wall, the doors will not open, the brakes do not respond, the throttle does not respond, the steering does not respond, and you can only watch in despair from inside. The Toyota unintended-acceleration case in the US already exposed this: badly written code really is deadly, and that was a fuel vehicle with, at worst, a braking problem. Today's self-driving electric cars expose vastly more attack surface than those Toyotas did. Are we sure the cars on the road now have no bugs?

Therefore, electric or fuel, there should be a bottom-line plan for safety: the driver must hold the highest authority over the trip computer and be able to control throttle, brake, and steering himself. Under certain circumstances the driver should be able to bypass the computer and act on the machinery directly: for example, a clutch like a manual transmission's that cuts the power, so a failed throttle controller no longer matters; a set of emergency brakes that bypasses the brake system; and a hard mechanical steering connection that disconnects the electronically controlled booster. Put this control at the left foot, somewhere awkward to press by accident; once pressed, the car switches to human control and the trip computer is stripped of authority, ready for emergencies.
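As a rough illustration of that bottom-line arbitration (my own sketch, not any OEM's actual design; the command fields and the override latch are assumptions): every automated command passes through one gate, and a physical driver-override latch wins unconditionally over whatever the trip computer requests.

```python
from dataclasses import dataclass

@dataclass
class Command:
    throttle_pct: float   # 0-100
    brake_pct: float      # 0-100
    steer_deg: float      # road-wheel angle request

def arbitrate(computer: Command, driver: Command,
              override_latched: bool) -> Command:
    """Return the command actually sent to the actuators.

    If the driver has hit the override latch (the awkward left-foot
    switch proposed above), the computer's request is discarded
    entirely; otherwise driver braking still dominates."""
    if override_latched:
        return driver                      # computer is out of the loop
    if driver.brake_pct > 0:
        # Driver braking always beats a computed throttle request.
        return Command(0.0, max(driver.brake_pct, computer.brake_pct),
                       driver.steer_deg)
    return computer
```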

sina156
5 months ago

I have to bring out something I have treasured for a long time: the most classic "autopilot gets hacked" scene! The hacker remotely commandeering a fleet of cars in "The Fate of the Furious" (Fast & Furious 8) has been cited by countless people as a standard textbook case against autopilot. But a movie is, after all, a movie; played out against the real world, there is actually no need to be afraid. Today's new energy vehicles run an anti-theft verification every time they power on, generating a fresh random code for the check each time, and only after it passes does the high-voltage system come up so the car can drive. And that is before mentioning that every company uses a different gear-shift strategy.

Stepping back to the genuinely risky scenario: autonomous driving grabbing the steering wheel is no joke, and the risk is no lower than being hacked, because it is a sudden event while driving. The cause of this accident is still unknown, but I think it is worth discussing the exit mechanism of automatic driving while a person is operating the car. At present, operations with obvious intent, such as actively steering or braking, switch the vehicle to human takeover. Back to the question: the problem the user describes is "AP grabbed my steering wheel, so I could not steer back to the right." A description like that inflames easily and just as easily leads people into the ditch. Personally, I think the steering wheel jamming very likely has nothing to do with AP: since active steering to the right is mentioned earlier, the steering control had already been handed back and the driver had taken over. In the later phase, the wheel appears to have been stuck (whether AP actually grabbed it needs the details re-confirmed: what exactly happened when he tried to steer back, what exactly did the wheel do, was it completely stuck, or did it keep turning right?). A detailed description is needed. Some answerers say they want to try to reproduce it; that is probably too difficult, and nobody knows what was really going on.

I feel the whole internet is leading a siege against this big brother. After the brake failures and the unintended acceleration, now it is "AP grabs the steering wheel"... it is getting a bit demonized. Please be rational; let the bullet fly for a while. That said, the accident did happen, and I believe the driver's feelings and descriptions, but feelings and descriptions are not the vehicle's real information and data. I still hope Tesla makes good use of the data and does a proper job of proving its own case. In the end, it all depends on the data.
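As a rough illustration of the power-on check described above (purely illustrative; real immobilizers use dedicated transponder protocols, and the shared key and HMAC scheme here are my assumptions): a fresh random challenge on every start, and high voltage enables only when the key's response verifies.

```python
import hashlib
import hmac
import secrets

SHARED_KEY = b"vehicle-and-fob-shared-secret"   # provisioned at the factory

def fob_response(challenge: bytes, key: bytes = SHARED_KEY) -> bytes:
    """What the key/fob computes from the vehicle's random challenge."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def power_on() -> bool:
    challenge = secrets.token_bytes(16)          # new random code every start
    response = fob_response(challenge)           # sent over a radio link in reality
    expected = hmac.new(SHARED_KEY, challenge, hashlib.sha256).digest()
    hv_enabled = hmac.compare_digest(response, expected)
    return hv_enabled                            # drive only if this is True

print("HV enabled:", power_on())
```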

yahoo898
5 months ago

The premise of "AP grabbing the steering wheel" is that AP necessarily includes lane-centering: it turns or goes straight along the lane, so AP takes over the steering function of the wheel. If you want the control back, there are four ways (see the sketch after this comment):

1. Cancel AP. Steering control returns to the driver, and the wheel is never fought over.
2. Use a little extra force to turn the wheel against it. AP cancels automatically and control returns to the driver.
3. Turn on the turn signal. AP will not fight the driver for the wheel while he is steering; once the maneuver is done, AP cancels down to ACC adaptive cruise, and steering control is with the driver.
4. Brake. AP cancels automatically and steering control is handed back to the driver.

There is one more situation: the driving-assistance settings include lane departure avoidance. If it is set to "assist," then changing lanes without a turn signal will also fight you for the wheel, though with slightly less force than AP; that is, this phenomenon can occur even with AP off.

This design is fully verified, and there is a reason it takes a bit of extra force on the wheel to regain control: there may be human misoperation. The vehicle itself cannot read the driver's intention. If merely touching the wheel cancelled AP, the driver might not realize AP had been cancelled, which is very dangerous; applying a clearly greater force tells AP that the driver's explicit intention is to take over, so it hands control back. Moreover, AP requires the driver to keep his hands on the wheel, so some force is being applied all the time, and the system cannot tell whether the driver wants to take over or is simply resting his hands there. Then why not require cancelling AP before taking the wheel? Because in an emergency the driver usually cannot stay calm and easily forgets to cancel AP; if the wheel could not be taken over at that moment, it would be extremely dangerous.

Now to the incident itself. By the victim's description, when the driver started steering back to the left, AP fought him for the wheel, which also shows AP was engaged while the car was pulling right. AP fighting for the wheel means the driver tried to take over the steering without cancelling AP and without using the turn signal. Overall, the judgment is that the driver was probably unfamiliar with the AP function: instead of the conventional operations for cancelling AP, he tried to change direction directly, while AP was still controlling the wheel and a greater force was needed to take over. The driver did not know this logic and did not apply a large enough force to break the wheel free, hence the impact on the right side. It also shows the driver lacked good driving habits, because even in a situation that unclear, emergency braking was still the right move.

In addition, Tesla carries its own millimeter-wave radar and ultrasonic radar. Millimeter-wave radar has poor recognition accuracy for non-metallic objects; ultrasonic radar works but its range is too short, usually little more than a meter. The main road-recognition function relies on the camera, and if an obstacle's color and shape are close to the road's, it is easy to misrecognize. Many people find a lot of Tesla's control logic unbelievable. The reason is not that Tesla is strange, but that cars with driving-assistance functions (emergency braking, lane keeping, lane departure avoidance, and so on) are still rare, or the functions appear in isolation with unimpressive results, so few owners have genuinely experienced and regularly used them. In fact the control logic of these functions is entirely reasonable, and the auto companies developing autonomous driving today do the same. Do not believe the keyboard experts on the internet arrogantly declaring this unreasonable and that wrong; listen to what the owners say. People who use the feature every day are surely more reliable than someone who has never touched it.
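Here is a minimal sketch of those four takeover paths (a generic illustration; the state names and the torque threshold are my assumptions, not Tesla's actual firmware):

```python
from dataclasses import dataclass

# Assumed threshold; the comment only says "a little more force".
TAKEOVER_TORQUE_NM = 3.0

@dataclass
class DriverInput:
    cancel_stalk: bool    # explicit AP cancel
    steer_torque: float   # Nm the driver applies against the wheel
    turn_signal_on: bool
    brake_pressed: bool

def ap_next_state(ap_engaged: bool, d: DriverInput) -> str:
    """Return what AP does given the driver's input this cycle."""
    if not ap_engaged:
        return "off"
    if d.cancel_stalk:
        return "off"            # path 1: explicit cancel
    if abs(d.steer_torque) > TAKEOVER_TORQUE_NM:
        return "off"            # path 2: driver overpowers the wheel
    if d.turn_signal_on:
        return "acc_only"       # path 3: lane change, fall back to ACC
    if d.brake_pressed:
        return "off"            # path 4: braking cancels AP
    # Below the torque threshold AP keeps lane-centering and resists the
    # input, which the driver experiences as "grabbing the steering wheel".
    return "engaged"
```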

leexin
5 months ago

To this day I truly admire Tesla owners: they spend their own money on Musk's cars and collect data for Musk with their lives. I do not know whether you have seen Will Smith's movie "I, Robot," in which a self-driving Audi is taken over and tries to kill Smith; that is now being staged on Teslas. The difference is that the Audi in the movie was deliberately controlled to kill its owner, while with Tesla's accidents, Musk knows there will be bugs; he just does not know where. Either way, he can pull out the data and say that deaths under Autopilot are fewer than deaths under human driving, so Tesla is fine. That is what Autopilot is. I will buy one the day I decide I want a cremation.

greatword
5 months ago

I have never been optimistic about vision-only autonomous driving. The vision solution, to put it bluntly, has the machine imitate the way a human drives a car, with cameras imitating human eyes. Frankly, the scheme is not clever at all: the human eye and brain's ability to perceive the world is simply beyond today's cameras and processors. A robot should work like a robot. Computers use binary, not the decimal system humans are good at and accustomed to; that is the essence of the difference between how machines and humans handle problems. A smart car's perception of the world should come from a whole battery of high-end technology: radars of various kinds, scanning devices, sensors, ultra-high-definition cameras, high-precision maps, low-latency networks, vehicle-to-vehicle communication, high-precision global positioning, and automotive-grade high-performance processors. Only when that series of problems is solved on every front, from hardware technology to cost to software, is anyone qualified to talk about deploying autonomous driving; that is its fundamental premise. Making a robot drive like a human with nothing but a few cameras is extremely irresponsible.

The vision solution is like having someone with serious eye trouble and a muddled brain drive the car: "can't see clearly," "can't see it at all," and "can't understand it" are everyday occurrences, above all at night, in dim light, or in environments full of interfering light sources. If the accident happened as the driver describes, one can only say the driver trusted Tesla's so-called autonomous driving far too much (Tesla no longer dares call it autonomous driving, and has renamed it automatic assisted driving); it is plainly an accident caused by inattention. We are at the embryonic stage of autonomous driving technology; everything marketed as autonomous driving today is actually assisted driving, and there is a long, long road before human technology achieves the real thing. So when choosing a so-called "autonomous driving" model, better a conservative scheme than a radical one, and when using an autopilot function you must keep reminding yourself: the one driving is a person with bad eyesight and a pitted brain, and I have to watch him closely. Tesla's "autonomous driving" is the typical radical: its publicity ignores safety, countless people are led into blind trust, and the resulting accidents are long past one or two. This accident is simply another typical case.
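To illustrate the redundancy argument above, a toy sketch (my own illustration, not a production fusion stack): an obstacle counts as confirmed only when independent sensor modalities agree, so a single fooled modality, such as a camera at night, cannot decide alone.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # "camera", "radar", "lidar", ...
    confidence: float  # 0.0-1.0

def obstacle_confirmed(detections: list[Detection],
                       min_sensors: int = 2,
                       min_conf: float = 0.5) -> bool:
    """True when at least `min_sensors` distinct modalities each report
    the obstacle with usable confidence."""
    voting = {d.sensor for d in detections if d.confidence >= min_conf}
    return len(voting) >= min_sensors

# A camera misreading a concrete wall as road drops its confidence;
# lidar and radar still carry the vote.
print(obstacle_confirmed([Detection("camera", 0.2),
                          Detection("lidar", 0.9),
                          Detection("radar", 0.7)]))  # True
```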

loveyou
5 months ago

Isn't Tesla just walking Samsung's old road: blow up, deny, shift the blame, get named by regulators, start admitting mistakes while dodging the substance, and lose the market? It reminds me of a joke. A young man at an auto show complained, "This car really won't stop!" A security guard overheard him, picked him up, and threw him out. The young man protested: "I never said which car; how can you just grab me?" "Don't play dumb," the guard roared. "I've been a car-show security guard all these years. You think I don't know which car can't stop?"

strongman
5 months ago

1. He says he was driving normally and AP snatched the steering wheel. But AP must be turned on manually; it does not engage by itself. 2. About "snatching the wheel": I have driven plenty of cars with LCC+ACC. Whether a pinky can overpower the system I am not sure, but a single index finger certainly can. 3. He "whooshed" past, which plainly means speeding. To sum up: a typical case of a driver speeding and driving dangerously, causing a death. His uncle was traveling with him, and fearing pressure from his relatives, he threw the blame elsewhere first. Luckily for him he was driving a TSL, which makes blame very easy to shift. Maybe some people will not like hearing this, but let me say it anyway: China is an enormous country, with tens of thousands of accidents every day and hundreds of people dying in traffic daily. What is the point of staring only at TSL?

stockin
5 months ago

Tesla has made the era of robots arrive early. The recent Tesla accidents, whether this case of Tesla seizing a human's control of the car or the earlier braking problems (which in essence were Tesla taking away humans' direct control of the brakes; judge the brake logic for yourself), all show that Tesla is running a live robot experiment on human beings. Unfortunately the experiment keeps hitting bugs, and the bugs cause accidents.

"Robots must not hurt people" is an ideal rule many have thought of, but it has always had one problem: it imagines a perfect robot. If the robot itself is not perfect, problems follow, the typical one being a robot's error of judgment. In this accident, the Tesla robot seized control of the car from the human, theoretically to protect the human; but the premise behind its judgment did not hold, namely, Tesla had not detected the isolation barrier. This has been verified repeatedly in recent years: Tesla's recognition and judgment system is not omnipotent, and bugs keep appearing. For example, a Tesla once drove into an overturned truck on a highway without decelerating, because it did not recognize the white vehicle: video shows a Model 3 running into a white container truck lying across the highway, the front of the Model 3 actually wedged into the container, with no sign of slowing before the impact. Many similar situations have occurred. Before the machine's recognition and judgment reach a very high, even perfect, level, handing full control of the car to the trip computer is very dangerous.

Think of Asimov's laws of robotics (the zeroth law was added later). Zeroth law: a robot must protect the overall interests of humanity from harm. First law: a robot may not injure an individual human, or through inaction allow an individual human to come to harm, except where that conflicts with the zeroth law. Second law: a robot must obey orders given by humans, except where such orders conflict with the zeroth or first law. Third law: a robot must protect its own existence as long as that does not conflict with the zeroth, first, or second laws. The zeroth law means robots may disregard individuals when forming judgments on behalf of humanity as a whole. So on the question of killing two people or killing one, Tesla and every other robot will decide to kill the one.
