This article is part of the On Tech newsletter. You can sign up here to receive it on weekdays.

The QAnon conspiracy theory, the promotion of fake health treatments and calls for violence based on false claims of election fraud have a common thread: Facebook groups.

These forums for people with a shared interest can be wonderful communities for avid gardeners in the same neighborhood or for parents whose children have a rare disease. But for years it has also been clear that the groups supercharge some people’s tendency to get into heated online fights, spread engrossing information whether or not it’s true, and scapegoat others.

I don’t want to oversimplify and blame Facebook groups for every bad thing in the world. (Read my colleague Kevin Roose’s latest column for suggestions on how to tackle polarization and draw people into more constructive activities.) And fixing Facebook is not as straightforward as the company’s critics believe.

However, many of the toxic side effects of Facebook groups stem from the company’s own choices. I asked several experts in online communication what they would do to reduce the downsides of groups. Here are some of their suggestions.

Stop automated recommendations. Facebook said it has temporarily stopped its computerized recommendations for people to join groups related to politics. Some experts said Facebook should go further and stop computerized group suggestions entirely.

It’s nice when Facebook suggests a forum about growing roses to someone who posts about gardening. But for years, Facebook’s group recommendations have proved easy to manipulate and have pushed people toward ever more fringe ideas.

The Wall Street Journal has reported that Facebook’s own research in 2016 found that two-thirds of people who joined extremist groups did so at Facebook’s recommendation. Automated group recommendations were also one of the ways the QAnon conspiracy theory spread, my colleague Sheera Frenkel said.

Ending these computerized suggestions would not be a silver bullet. But it’s maddening how many times activists and academics have sounded the alarm about harmful recommendations while Facebook merely tinkered at the margins.

Provide more oversight of private groups. The social media researchers Nina Jankowicz and Cindy Otis have suggested that groups above a certain number of members should not be allowed to stay private – meaning newcomers must be invited and outsiders cannot see what is being discussed – without regular human review of their content.

“A lot of really toxic groups are not searchable and are only available by invitation, and that is extremely problematic,” said Jankowicz.

Jankowicz and Otis have also advocated more consistent labeling of groups and more transparency about who runs them. Political discussion groups are sometimes deliberately labeled “personal blogs” by their administrators to avoid the extra scrutiny that Facebook applies to political forums.

Target the habitual offenders. Renée DiResta, a disinformation researcher at the Stanford Internet Observatory, said Facebook needs to be “more determined” about groups that repeatedly harass others or otherwise violate Facebook’s rules. Facebook took a few steps in this direction last year.

Jade Magnus Ogunnaike, a senior director at the racial justice organization Color of Change, also said Facebook should stop using contractors to review material on the site. Converting those workers to staff positions would be fairer, she said, and could help improve the quality of group oversight.

Add some … librarians? Joan Donovan, research director at Harvard University’s Shorenstein Center on Media, Politics and Public Policy, has suggested that large internet companies hire thousands of librarians to provide people with vetted information and counter the groups wallowing in misinformation.

Jeff Bezos likes to say that failure is healthy because people and companies learn from it. But sometimes failure is a consequence of a company’s weaknesses, and it’s not a good thing.

In the past few days, there has been news coverage of the failure of Amazon and Google to develop successful video games of their own, despite having nearly unlimited money and smart people at their disposal.

The roots of their failure are complex, but two problems stood out to me: cultural weaknesses and hubris. (And in Amazon’s case, an overreliance on Bezos’s wisdom distilled into “Jeff-isms,” like the one above.)

Here’s what happened: Google said this week that it would shut down its video game development group. And Bloomberg News explained why Amazon has repeatedly flopped at developing its own blockbuster video games.

Bloomberg described Amazon’s struggles as a reflection of its Amazon-ness. An obsession with data caused people to lose sight of making games that are fun, it reported, and executives who trusted in the company’s expertise forced employees to use Amazon’s own game-development technology rather than the industry standard.

For all its successes, Google also has ingrained habits that sometimes make it difficult to break into unfamiliar areas. The technology news publication The Information this week detailed the problems facing Google’s business of selling cloud computing technology to companies.

Google engineers are treated like royalty, and it has been difficult to persuade them to commit to the rigid three-year product road maps that corporate customers love. Google’s cloud business has grappled with the same basic problem for years – it shows how hard it is for Google to accommodate the prosaic habits of its business customers.

The magical (or maddening) thing about cash-rich superstar companies is that failure can often lead to eventual success. But the difficulties Amazon and Google have had in businesses outside their core competencies are a reminder that sometimes their strengths blind them to their weaknesses.