
The economics of social media

The growing interest in regulating the market power and influence of social media platforms has been accompanied by an explosion in academic research. This column synthesises the research on social media and finds that while it has made dramatic progress in the last decade, it mostly focuses on Facebook and Twitter. More research is needed on other platforms that are growing in usage, such as TikTok, especially since these platforms tend to produce different content, distribute content differently, and have different consumers. Studying these changes is crucial for policymakers to design optimal regulation for social media’s future.

The Digital Markets Act (Scott Morton and Caffarra 2021) and the Digital Services Act highlight the growing interest in regulating the market power and influence of social media platforms. This heightened policy interest, paired with the explosion in academic research studying social media (pictured in Figure 1), generates a demand for a synthesis of the rapidly expanding literature in order to help guide the many policy debates. In a recent paper (Aridor et al. 2024a), we synthesise and organise the literature around the three stages of the life cycle of social media content: (1) production, (2) distribution, and (3) consumption.

Figure 1 Social media research in economics

Source: Aridor et al. (2024a)

Production of content

Social media platforms rely on user-generated content to attract users. Unlike traditional media that can directly shape content through editorial processes, social media platforms must rely on platform design – features, incentives, and rules – to influence content production. The challenge for platforms is to incentivise the creation of content that attracts user engagement and advertisers, while deterring the creation of harmful content such as misinformation and hate speech.

There is evidence that the production of content responds to different types of incentives. Non-monetary incentives such as peer awards or feedback (including badges, reactions, likes, and comments) have been shown to moderately increase the amount of content produced in the short run (Eckles et al. 2016). While monetary incentives could theoretically crowd out prosocial motives, the literature has also found strong positive effects of monetary incentives, such as ad-revenue sharing programmes, on content creation (Abou El-Komboz et al. 2023). As opposed to quantity, the quality of content produced – proxied, for instance, by the subsequent number of likes received – seems relatively more difficult to influence. Non-monetary incentives tend to have small effect sizes on quality (Zeng et al. 2022, Srinivasan 2023) and the evidence for monetary incentives is mixed (Sun and Zhu 2013, Kerkhof 2020). 

Due to its social consequences, a policy-relevant dimension of content quality is whether it contains misinformation or ‘toxic content’ (e.g. hate speech). In terms of misinformation, a vast literature studies several types of interventions that seek to deter the production – mostly the re-sharing – of false articles, while keeping constant or even increasing the sharing of truthful information (Kozyreva et al. 2022, Pennycook and Rand 2022, Martel and Rand 2023). When comparing across interventions, nudging or prompting users to think about the prevalence of misinformation (Guriev et al. 2023) and digital literacy campaigns that train users to identify emotional manipulation (Athey et al. 2023) seem to be particularly effective. In terms of toxic content, reducing users’ exposure to toxicity (Beknazar et al. 2022) and some types of counterspeech – messages that reproach the producers of toxic content (Munger 2017) – have been found to deter the production of such content, though with small effect sizes. ‘Harder’ sanctions such as post deletions (Jiménez Durán 2022) tend to have null or at best small effects.

Distribution of content

After content is produced, platforms distribute it to users. The distribution of content can be affected both by users’ social networks and by the platforms’ algorithms. There is an ongoing debate on whether and how to regulate the content that algorithms promote and downrank. Specifically, there is a concern that by promoting like-minded or low-quality content, algorithms may distort beliefs or polarise users (Aral 2021, Campante et al. 2023). The best evidence on this topic comes from Facebook. Based on both experimental variation (Levy 2021) and internal data (González-Bailón et al. 2023), there is growing evidence that Facebook’s algorithms tend to promote like-minded content, though the effects are still being debated (Messing 2023). In terms of content quality, Facebook’s algorithm may increase the amount of uncivil content but also decrease exposure to untrustworthy accounts (González-Bailón et al. 2023, Guess et al. 2023). These results are consistent with social media platforms trying to maximise engagement, while perhaps downranking specific posts due to other incentives, such as the platforms’ reputation. Other concerns regarding algorithms have received less support in the literature. For example, YouTube’s recommendation system does not seem to drive users into extreme rabbit holes (Hosseinmardi et al. 2021, Chen et al. 2023).

In addition to distributing organic content, platforms distribute ads to users. In contrast to traditional advertising, ads on social media can accurately target users based on various characteristics and are thus especially valuable (Gordon et al. 2023, Tadelis et al. 2023). A key policy debate with regard to social media advertisements is how to balance the trade-off between the consumer welfare gains from privacy and the dependence of firms on advertising revenues. On the one hand, personal data are clearly valuable for firms. Wernerfelt et al. (2022) find that removing access to off-platform data would increase median acquisition costs for Facebook advertisers by 37%, and Aridor et al. (2024b) find that Apple’s App Tracking Transparency policy – which allowed consumers to opt out of sending these data to applications – led to significant revenue losses for Facebook-dependent direct-to-consumer firms. On the other hand, consumers may highly value maintaining the privacy of their data. Lin et al. (2023) elicit incentive-compatible valuations for consumers’ data and find that the distribution of privacy preferences is heavily skewed and that consumers most value protecting the privacy of their friend network and their posts on the platform.

Consumption of content

Consumers allocate their time between consuming content served by the platform and off-platform activities. Their choices are influenced by consumption spillovers where others’ consumption choices influence how people use social media, habit formation where consumption today makes people want to use more in the future, and self-control problems where people use social media more than they would like to (Eckles et al. 2016, Allcott et al. 2020, Allcott et al. 2022, Aridor 2023).

These choices affect the wellbeing of consumers. Experiments eliciting how much users need to be paid to stop using social media find that users place a high value on access to it (Brynjolfsson et al. 2019, Brynjolfsson et al. 2023). However, Bursztyn et al. (2023) point out that non-users could derive negative utility from others’ social media usage, and they find evidence of negative consumer welfare once this spillover to non-users is accounted for. This explanation is consistent with empirical evidence suggesting that social media has adverse effects on subjective wellbeing and mental health (Allcott et al. 2020, Mosquera et al. 2020, Braghieri et al. 2022). Importantly, these results do not imply that consumer welfare is negative at every level of social media consumption; some level of social media use may be beneficial.

Social media consumption can also have both positive and negative aggregate impacts. On the positive side, social media has been shown to increase news knowledge and facilitate protest in democracies (Fergusson and Molina 2021, Guess et al. 2023a). On the flip side, social media has been linked to beliefs influenced by misinformation and to offline hate crimes (Allcott and Gentzkow 2017, Müller and Schwarz 2021, Jiménez Durán et al. 2022). The evidence on polarisation and voting is more mixed and context-dependent (Levy 2021, Garbiras-Díaz and Montenegro 2022, Guess et al. 2023a, 2023b, Nyhan et al. 2023, Fujiwara et al. forthcoming). These effects operate through several channels, including social media as a platform for exposure to persuasive content, as a facilitator of coordinated action, and as an influence on people’s perceptions of others.

Beyond looking at on-platform and off-platform behaviour, recent research has studied consumers’ substitution patterns across platforms. Amid concerns that the market for social media applications has become too concentrated, measuring substitution patterns is crucial for assessing the degree of market concentration. There is evidence that consumers substitute not only to other social media apps, but also to communication apps and non-digital activities (Collis and Eggers 2022, Aridor 2023).

Concluding remarks

In this column, we show that research on social media has made dramatic progress in the last decade. However, social media is rapidly changing, both in terms of the platforms used and the content produced, distributed, and consumed. Figure 2 shows that Facebook remains the most dominant platform, but that it faces competition from newer platforms. The figure also shows that academic research mostly focuses on Facebook and Twitter. More research is needed on other platforms that are growing in usage, such as TikTok, especially since these platforms tend to produce different content (e.g. more videos), distribute content differently (e.g. relying on the algorithm rather than on users’ social networks), and have different consumers (e.g. more content consumed by teenagers). Studying these changes is crucial for policymakers to design optimal regulation for social media’s future.

Figure 2 Platform representation in the economics literature

Source: Aridor et al. (2024a)

Source: VoxEU


