This is, I think, a major problem with this technology. It's not just the owner/driver of the Tesla, who voluntarily chose to be a guinea pig for FSD development, who's affected, but other drivers on the road who never signed up for it. The people in the other cars involved in this accident (and others like it), at best, had their holiday ruined and now have to deal with insurance and a replacement car that may not be as good as the one they had; at worst, they suffered injuries or fatalities. Add to that everyone delayed by the resulting congestion. They're all dealing with these repercussions even though many, if not most, of them don't own Teslas and never agreed to be beta testers so a company can make money at their expense. Perhaps it's time lawmakers required self-driving modes to be limited to roads with minimal traffic: if the car's numerous sensors detect more than a couple of cars around it, FSD would refuse to activate, or would deactivate (with sufficient warning to the driver) if already engaged.
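Just to sketch what I mean by that kind of interlock (this is purely illustrative; the class, threshold, and warning period below are made up, not anything Tesla's software actually exposes), the rule is a simple gate on traffic density:

```python
from dataclasses import dataclass

# Hypothetical values for illustration only.
MAX_NEARBY_VEHICLES = 2        # more than this and FSD is not allowed
HANDOVER_WARNING_SECONDS = 10  # a real system would run a countdown/alert for this long


@dataclass
class TrafficDensityGate:
    """Toy model of a traffic-density interlock for a self-driving mode."""
    engaged: bool = False

    def request_activation(self, nearby_vehicle_count: int) -> bool:
        """Refuse to engage when the sensors report dense surrounding traffic."""
        if nearby_vehicle_count > MAX_NEARBY_VEHICLES:
            print("Self-driving unavailable: too much surrounding traffic.")
            return False
        self.engaged = True
        return True

    def update(self, nearby_vehicle_count: int) -> None:
        """If already engaged and traffic builds up, warn the driver and disengage."""
        if self.engaged and nearby_vehicle_count > MAX_NEARBY_VEHICLES:
            print(f"Traffic detected: take over within {HANDOVER_WARNING_SECONDS} seconds.")
            # A real system would alert the driver for the full warning period
            # before handing back control; here we just disengage.
            self.engaged = False
            print("Self-driving disengaged; manual control required.")


if __name__ == "__main__":
    gate = TrafficDensityGate()
    gate.request_activation(nearby_vehicle_count=1)  # engages: light traffic
    gate.update(nearby_vehicle_count=5)              # dense traffic: warn, then disengage
```

The hard parts in practice would be picking the threshold and making the handover warning long enough to be safe, but the basic gating logic is that simple.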