Hawk Tuah Coin and the Wild West of Online Content

Haliey Welch | @hay_welch/Instagram

Last month, Haliey Welch, who became an internet superstar earlier this year after a clip of her discussing her favorite bedroom antics went viral, debuted her latest project: a crypto token known as HAWK. After much fanfare, and heavy promotion to her fans and followers, HAWK hit a massive market capitalization of around $490 million within minutes of launch. Unfortunately for buyers, the token soon crashed, losing roughly 93% of its value. Over 14,000 investors are still holding HAWK tokens, which now have a collective value of around $2 million.

Plenty of Hawk Tuah fans obviously lost serious money on these trades, which Welch had directly and enthusiastically promoted across her various social media channels. Nonetheless, in the immediate aftermath of the launch, Welch and the team behind the HAWK token denied any personal responsibility, claiming that neither Welch nor her direct collaborators received any free tokens or were among those rapidly selling tokens off post-launch.

The HAWK team claims this is a long-term project that has been negatively impacted by “snipers,” private investors who take unfair advantage of hyped new coins by executing trades at blazingly high speed. (These individuals often use automated tools or bots to monitor blockchain activity and execute trades at peak efficiency.) But the internet did some additional digging, and fresh allegations suggest that Welch and/or her collaborators may have personally profited off of the HAWK hype and crash.

Data from Bubblemaps, released on X last week, indicates that the HAWK team “pre-sold” many coins to a group of 285 insiders, and that these unnamed individuals proceeded to sell off much of their holdings immediately post-launch. In fact, 89 of these wallets sold off 100% of their stake in the HAWK token. (Blockchain data indicates that one wallet alone made $365,000 across 23 transactions on the day the token launched.) In all, these insider accounts pulled in a total of $3.3 million from HAWK token sales, which likely played a significant role in the eventual crash.

Then there’s the matter of fees. In a widely shared 24-minute deep dive on the HAWK token, YouTuber Coffeezilla (aka Stephen Findeisen) notes that HAWK sales generated just under $2 million in fees alone. During an X/Twitter Spaces chat with Haliey Welch and other members of the HAWK leadership team, Findeisen attempted to get some clarification on who specifically received these funds. The HAWK team claimed the fees were designed to deter the very “crypto snipers” they blame for undermining the project, but provided no clear answer on who actually received the money.

The “Wild West” nature of the cryptocurrency market is an intrinsic part of its core appeal, the very concept that differentiates independent, decentralized crypto from fiat currency. Investors who want protection and security can buy government bonds or invest in blue-chip stocks. Those with more appetite for risk can try their hand at new and enticing make-or-break projects like meme coins.

Coffeezilla, reporters from Coindesk and Decrypt, and anyone else who’s curious can look directly at the blockchain, see all the transactions that took place, and draw their own conclusions.

Back in 2021, Logan Paul famously promoted an NFT-based game called CryptoZoo that, according to a 2023 class action lawsuit brought by players, never actually worked, and was an entirely “fraudulent venture.” In January, Paul announced a $2.3 million program to refund some of the players by buying back their now-worthless NFTs.

Beyond the world of crypto, creators negatively impacting their own fans and followers – spreading misinformation, encouraging them to gamble money they don’t have, selling them moldy cheese snacks, and more – has become something of an industry standard. The websites and apps hosting these creators offer little to nothing in the way of basic protections for their users. That’s partly a matter of law – Section 230 of the Communications Decency Act shields tech companies from liability for third-party content posted to their platforms – and partly a matter of profit. YouTube doesn’t want to stop its most popular creators from making videos that bring in tons of eyeballs and ad revenue, and it doesn’t want to limit the reach of their off-platform projects. That’s not good business.

A report released just this week by the Center for Countering Digital Hate (CCDH) found that YouTube’s algorithm was helping to boost videos that could trigger eating disorders or self-harming behavior among teenage viewers. CCDH researchers created a fake profile for a 13-year-old girl who had searched for keywords that frequently come up in discussions about eating disorders. (For example, terms like “ABC diet” or “safe foods.”) 

They then found that this user’s YouTube “Up Next” recommendations were filled with videos about weight loss, diet, and exercise, many of them with unhealthy or inappropriate suggestions. According to CCDH, nearly two-thirds of the videos that YouTube recommended to this hypothetical user featured “problematic weight loss content” that’s “likely” to push them deeper into an eating disorder.

Even worse, a full one-third of the recommended videos were deemed outright “harmful” by CCDH, promoting or glamorizing eating disorders and bullying people about their weight. Around 50 of the 1,000 videos scanned by CCDH contained content involving self-harm or suicide.

Ethically, these platforms should take responsibility for what content gets served to young, impressionable viewers, but in the short term, creators simply have to police themselves.

Tech and media companies seek to maximize ad revenue and time spent on their platforms, two metrics that don’t necessarily align with responsible stewardship and user satisfaction. So many internet destinations – from Google search to the app formerly known as Twitter to Reddit forums and beyond – have degraded over the last few years, clogged with an ever-increasing amount of misinformation, scams, and bad actors that are often difficult to distinguish from valid, well-meaning users. If creators don’t want their own native platforms to similarly degrade, and become increasingly unpleasant or even useless for everyday users, self-policing and ensuring that they’re not exploiting their audience becomes more than the “right thing to do.” It becomes a necessity.
