
2025 Google Summer of Code: Phase 1

Azeez Elegbede

10 min read

At AsyncAPI, summer means one thing: Google Summer of Code.

Every year, we eagerly anticipate this season, not just because of the program itself, but because of what it represents for our community. GSoC isn't just about the stipend contributors receive for their participation. For us, it's about the doors it opens: the chance to welcome new developers, foster fresh perspectives, and nurture newcomers who often go on to become long-term contributors to the project.

Participating in Google Summer of Code is an opportunity to invest in mentorship, give back to the AsyncAPI community, and continue building the future of event-driven APIs, one contributor at a time.

How We Prepared

One of the most challenging aspects of participating in Google Summer of Code is getting accepted as a mentoring organization. At AsyncAPI, we recognized that increasing our chances of being selected would require early and deliberate planning, so we began preparing well before Google announced the official program dates.

Here’s how we laid the groundwork for a successful GSoC 2025 first phase:

  • Kicking Off with Community Discussion: We initiated an open GitHub Discussion to serve as the central hub for all things GSoC-related. This thread became our base for collaboration and coordination throughout the application process. It helped us engage the community early, keep everything transparent, and foster involvement from contributors and mentors.

  • Collecting Project Ideas: Identifying meaningful and impactful projects is a key part of the GSoC application. Through the GitHub Discussion, we invited community members to propose project ideas they were excited about. This resulted in 15 incredible suggestions, including 9 solid proposals that had the potential to evolve into full-fledged GSoC projects.

  • Calling for Mentors: Mentors are the heart of a successful GSoC experience. Rather than manage a separate process, we used the same discussion thread to invite experienced contributors and maintainers to step forward as potential mentors. This approach helped us keep all conversations in one place and encouraged mentors to engage with idea proposals, provide feedback, and collaboratively shape the project scope.

  • Curating and Centralizing Ideas: From the 15 suggestions we received, we curated the 9 project ideas that best aligned with AsyncAPI’s community goals and roadmap. These were published in our community repository under the mentorship directory. This centralized format made our GSoC application seamless: we simply shared the link with program organizers, making it easy for them to review our proposals.

  • Hosting a GSoC Livestream: After being formally accepted as a mentoring organization, we hosted an introductory livestream for potential GSoC contributors. In this session, we walked students/contributors through everything they needed to know, from selecting the right project and understanding the application timeline, to crafting strong proposals and making meaningful contributions to AsyncAPI.

What We Failed To Prepare For

After participating in Google Summer of Code for over three years, we thought we’d seen it all, but 2025 brought a new kind of challenge we weren’t fully prepared for.

You might think: “What could possibly catch a seasoned community like AsyncAPI off guard?” Well, the moment the official list of mentoring organizations was published, we experienced an overwhelming influx of enthusiastic contributors eager to participate in the program through our organization.

At first, this sounded like a good problem to have, and in many ways, it was. However, the intensity and desperation of some contributors to secure a spot in GSoC led to a wave of behavior that disrupted the community’s usual flow. Some examples included:

  • Submitting multiple pull requests in a short time without meaningful contributions
  • Relying on AI tools to generate code without understanding the problem
  • Mass-creating low-quality issues across various AsyncAPI repositories
  • Spamming our Slack channels with repetitive and irrelevant messages
  • Flooding mentors’ and maintainers’ direct messages with irrelevant or insistent requests

This sudden surge, along with the behavior that accompanied it, was unexpected, creating confusion, noise, and stress for mentors, maintainers, and contributors alike.

How We Responded

Thankfully, our Code of Conduct Committee stepped in quickly and took action to ensure the community remained a respectful and engaging space for everyone. Here’s how we handled the situation:

  • Temporarily disabled the #general Slack channel to stop the wave of spamming and allow things to cool down.
  • Removed spammy or inappropriate messages from public channels and sent firm but fair warnings to those involved.
  • Monitored GitHub activity to close or remove issues and pull requests that didn’t meet our contribution standards.
  • Shared clear guidelines on how to contribute meaningfully, behave respectfully, and follow our community’s Code of Conduct.

While this was the first time we experienced something like this at scale, it served as a valuable learning opportunity. We now understand the importance of having proactive communication, stronger onboarding for newcomers, and safeguards in place ahead of time, especially during high-visibility programs like GSoC.

As we look ahead, we’re already working on ways to better manage the influx of contributors in the future, not by discouraging enthusiasm, but by channeling it productively.

Our Selection Process

This year, AsyncAPI received a record-breaking 204 GSoC applications across all proposed project ideas. The volume and quality of proposals were incredibly impressive, and we honestly wished we had more slots to accommodate the many promising contributors.

However, with only 7 project slots officially allocated to us (out of 9 submitted ideas), we had to make thoughtful, strategic decisions during the selection process.

To keep things fair and efficient, we divided our review process into two main phases:

Admin Review Phase

Given the large number of applications, we knew some wouldn’t meet the basic criteria. To avoid overwhelming mentors with low-quality or off-topic proposals, our organization admins conducted an initial screening of all applications.

This phase helped streamline the process by filtering out proposals that clearly didn’t meet our standards. Here are the key rejection criteria used during this phase:

  • No prior open-source contributions: We prioritized applicants who had engaged with the community and shown initiative through prior contributions.
  • Likely written using AI tools (LLMs): Proposals that lacked depth, originality, or contained generic phrasing common to AI-generated content were filtered out.
  • Duplicate or near-identical submissions: We noticed a number of applications with very similar language and structure, often submitted across different projects.
  • Misaligned proposals: Applications that didn’t align with any of our published project ideas were disqualified early on.

Only proposals that cleared this initial filter were passed on to mentors for detailed contextual and technical review.

Mentor Review Phase

Once the initial screening was complete, the proposals that made it through were left on the proposals dashboard for mentors to review in depth.

Each mentor reviewed applications related to their project ideas, looking for contributors who understood the goals and demonstrated genuine interest in solving the problem. This phase was all about finding the right contributors who were capable of collaboration, committed, and passionate about open source.

Mentors evaluated applications based on the following criteria:

  • Clarity and completeness of the proposal: Strong applications clearly outlined deliverables, milestones, timelines, and showed evidence of understanding the project's problem statement.
  • Quality of contributions: Contributors who had already submitted meaningful pull requests or engaged in issue discussions stood out.
  • Community engagement: Applicants who were active on Slack, asked insightful questions, or contributed positively to the community made a strong impression.
  • Alignment with project vision: Ultimately, mentors looked for contributors whose goals aligned with the purpose and direction of the project, contributors who could be seen as long-term community members beyond GSoC.

After reviewing, mentors shortlisted top candidates for each project and collaborated with the org admins to finalize the contributors we would accept for GSoC 2025.

Rejection Turns Disruptive

Even with a fair selection process by the mentors, the announcement of selected contributors didn’t go entirely as planned. After the results were published, a few applicants who didn’t make it into the program responded poorly and questioned the process in unprofessional and disruptive ways. Some of the issues we faced included:

  • Hostility Toward Mentors: Some rejected applicants accused mentors of favoritism, simply because they couldn’t handle the outcome maturely. Mentors evaluated every proposal based not only on its quality but also on the contributor’s prior engagement and meaningful contributions to the project. These baseless accusations were both unfair and disheartening for mentors who volunteered their time to support the community.

  • Impersonation and Harassment: A few individuals went so far as to create fake accounts to discredit accepted contributors, attempting to diminish their hard work. This behavior was both appalling and harmful: it caused some accepted contributors to doubt themselves and cast a negative light on the spirit of open source collaboration.

Responding to Harassment Incidents

Thankfully, members of the Code of Conduct Committee stepped in quickly and handled the situation with care and professionalism. They investigated the incidents, identified some of those responsible, and gave them a choice: issue a public apology acknowledging the immaturity and harm caused, or face appropriate consequences.

Members of the Code of Conduct Committee personally reached out to the affected contributors and mentors to offer heartfelt apologies and provide reassurance, ensuring they felt supported and valued in the aftermath of the incident.

At AsyncAPI, the Code of Conduct is at the heart of our community values. Everyone is expected to respect and uphold it, and we do not tolerate actions that threaten the well-being of others. The CoC team will publish a follow-up article to provide more context from this incident.

Rejection is tough; we understand that. But it should never be a reason to attack others or question their worthiness.

At AsyncAPI, we value growth, openness, and community support, not finger-pointing or toxicity.

Wrapping Up

Selecting contributors is never easy, especially when the applicant pool is packed with passionate applicants eager to make an impact. However, by combining a streamlined review process with our community values, we were able to make informed decisions that we're confident will lead to successful project outcomes and long-term contributor growth.

We're incredibly excited to work with this year's cohort and can't wait to see what we achieve together as a community.

Big congratulations to everyone selected, we’re thrilled to have you on board and excited about what lies ahead.

Meet the 2025 AsyncAPI Contributors