While X (formerly Twitter) and Reddit have locked API access behind steep paywalls, Discord is taking the opposite approach. A few weeks ago, Discord began allowing US developers to sell their apps to Discord users in a centralized hub, and today that ability is being extended to developers in the UK and Europe.
Developers who create apps for Discord – from mini-games to generative AI tools to moderation bots – keep 70 percent of sales, with the remaining 30 percent going to Discord as a platform fee.
“We’ve had an open API since we opened our platform in 2015, and we knew the benefits from day one,” says Mason Sciotti, senior product manager for the Discord ecosystem.
When Sciotti joined Discord, his team consisted of just himself and two engineers. Now, Discord has more than 100 people building products for external developers.
“Sometimes it’s still hard for me to imagine there are more of us than the three we started with,” he told TechCrunch.
According to Discord, the platform hosts more than 750,000 third-party apps that are used by more than 45 million people each month. Currently, developers can monetize their apps by selling subscriptions, but the platform plans to offer tips and one-time purchases in the future.
To sell access to an app, developers must meet certain requirements and be in good standing with Discord. And to proactively police the open API ecosystem for bad actors, Discord has built a developer compliance team; it is part of the more than 100 new employees who have joined Sciotti’s organization since he started in 2017.
“We have a number of automated checks that happen at different stages, but the main ones are the verification processes before an app can grow,” he told TechCrunch. “Especially when it comes to data privacy, we want to make sure that users understand that by using [the app], they’re giving their consent.”
On the safety side, Discord also announced the launch of Teen Safety Assist, an initiative aimed at keeping teens safe on the platform.
The first two features of this initiative are safety alerts and sensitive content filters. If a teen receives a direct message from someone for the first time, Discord can send them a safety alert prompting them to double-check whether they want to reply. The teen will also be directed to safety tools that show them how to block the user or update their privacy settings if needed. And if Discord deems an image in a direct message sensitive, it will automatically blur it, so the user has to click through if they really want to view it. The feature is also available as an opt-in for adult users and is similar to a measure Apple added in iOS 17.
Discord is also launching an alert system that allows users to monitor their account status more transparently.