
Developer Experience Survey

Understand precisely what’s holding your developers back and which improvements would yield the highest value for the team.

PREP TIME

90 min

RUN TIME

45 min

PEOPLE

3+

5-second summary

  • Select your vital signs.
  • Run a developer experience survey and analyze data.
  • Discuss the results and document an action plan to improve developer experience.
WHAT YOU WILL NEED
  • Survey tool
  • Digital collaboration tool
  • Whiteboard
  • Post-its

How to create an effective developer experience survey

What is a developer experience survey?

Developer experience is how developers feel about, think about, and value their work. Unlike other quantitative metrics that measure the systems enabling developer productivity, experience can only be measured by asking people qualitative questions about their feelings. Surveys are an effective way to gather sentiment data about the developer experience.

At Atlassian, we believe the most effective approach to measuring developer productivity is to use a combination of quantitative and qualitative data. This combination gives us a more contextual understanding of the friction faced by developers.

Why run the developer experience survey play?

Asking developers how satisfied they are with their ability to be productive within their codebases, the tools they use, and the overall software development lifecycle (SDLC) provides tremendous insights and acts as a means to measure overall progress in improving the day-to-day life of the developer.

You can use this data to identify bottlenecks and prioritize improvements to the developer experience.

When should you run a developer experience survey play?

We recommend running the Developer Experience Survey Play at least twice each year if you are happy with your success metrics but need to maintain a pulse on vital signs, or quarterly if you are actively working on improving your developer experience.

5 benefits of a developer experience survey play

  1. Surveys allow us to measure things that are otherwise unmeasurable.
  2. Survey data provides missing context for quantitative vital signs.
  3. Running the survey on a semi-annual or quarterly basis enables you to track improvements over time.
  4. Surveys provide an opportunity for direct input from each engineer, helping them feel heard.
  5. Creating rituals around the survey gives teams a recurring forum to discuss the developer experience at least four times a year, which has a lasting impact on team culture.

1. Select your vital signs

Est. time: 30 MIN

To truly understand your team’s developer experience, you need to ask the right questions. At Atlassian, we focus our questions on key vital signs that help us uncover pain points within the developer experience. Vital signs are data points that act as engineering health indicators. Much like your body’s vital signs, they can quickly identify problems in the system.

Vital signs are a crucial component of this entire Play, so before you begin, come to an agreement with your team about which vital signs are important to your operations. We recommend including six to eight vital signs in your own organization-specific survey.

Incorporate our vital signs into your organization-specific survey, or use ours for inspiration and create your own. If a vital sign doesn’t apply, you can remove it from the survey in step two. When in doubt about a vital sign’s relevance, we suggest leaving it in until you’ve run the Play at least once.

2. Run developer experience survey

Est. time: 60 MIN

After you’ve chosen or created vital signs that apply to the developers on your team, develop relevant survey questions that capture the experience across the SDLC. Spend time upfront on survey design, and consider conducting usability testing to ensure the questions are capturing the intended signals.

You may also want to capture additional dimensions, such as area of work (e.g., platform or product), skillset (e.g., frontend or backend), tenure, role level, location, and other demographics. These attributes can be used to slice the data to analyze trends across different cohorts.

You will need to prepare your developers for the survey by communicating with them about the purpose of the survey and how you plan to follow up with survey results. Plan for how you will drive awareness and participation.

Next, invite all of your developers to complete the survey. Set a clear deadline; we recommend allowing at least a week. At Atlassian, we run our survey at the beginning of the last month of each quarter, with a two-week survey period.

Tip: Driving Survey Participation

As engineering teams can have vastly different engineering practices and tooling, capturing the diversity and variability between teams is important. This is only possible with high levels of participation. We recommend using a mixture of communication techniques, such as Slack (bot messaging) and team communication through managers or leaders. Setting a clear participation goal (we strive for 80%+ at Atlassian) and showing progress during the survey duration enables everyone to drive participation.

3. Analyze survey data and share results with team

Est. time: 30 min

To analyze survey data effectively, begin by organizing the collected responses into a structured format. Next, clean the data by checking for any inconsistencies or missing values, which may skew the results.

Once the data is prepared, employ descriptive statistics to summarize key findings, such as means, medians, and modes, which provide insights into central tendencies. Visual representations, such as charts and graphs, can help illustrate trends and patterns. For deeper analysis, consider using inferential statistics to draw conclusions about the larger population based on the sample data. For example, at Atlassian, we create:

  • Trend data (you will need to run the survey a few times to gather this data, but it is invaluable to understand how you are improving or regressing)
  • Heat maps (compare demographics and teams to find opportunities for collaboration with peer teams or focus areas where your team has a significant delta to peers)
  • Impact data (to help understand what drives the sentiment around your vital signs)
  • Theme analysis (if you choose to include a free-text response question, use AI to find the key themes and add context to your sentiment question)

Finally, interpret the results in the context of the survey's objectives, highlighting significant findings and any potential implications.
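The descriptive summary step above can be sketched with Python's standard library; the vital signs and response values below are hypothetical placeholders for your own survey export:

```python
from statistics import mean, median

# Hypothetical 1-10 satisfaction responses, grouped by vital sign
# (replace with data exported from your survey tool).
responses = {
    "Waiting time": [3, 4, 2, 5, 3, 4],
    "Maintenance": [2, 3, 4, 3, 4, 3],
}

# Summarize central tendency and sample size per vital sign.
for vital_sign, scores in responses.items():
    print(f"{vital_sign}: mean={mean(scores):.2f}, "
          f"median={median(scores):.2f}, n={len(scores)}")
```

From here, the same per-vital-sign summaries can be sliced by the demographic attributes captured in step two to build heat maps across cohorts.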

4. Discuss results and document an action plan to improve developer experience

Est. time: 15 min

Finally, brainstorm solutions after discussing survey results and identifying the top three most pressing opportunities for improvement. At Atlassian, engineering leaders hold team conversations to collaborate on creating action plans.

Follow-through matters. Sharing results together with an action plan demonstrates to developers that their voice matters and can lead to change, which helps sustain high survey completion rates going forward. The survey gives developers a way to contribute, so they feel more invested in developer experience outcomes, which leads to better follow-through and more consistent results.

Tip: Keep your survey up to date

Keeping your survey up to date is essential for measuring the signals that affect developer experience. Reviewing free-text comments for emerging themes and conducting interviews every three to six months are good ways to identify gaps in your survey.

Warning: If a vital sign gap is found, review how adding a question may impact the longitudinal data collected by your survey; survey design is crucial.

Follow-up

We recommend scheduling and running the developer experience survey at least twice per year if you are in a healthy state, or quarterly if you are actively prioritizing work to improve the developer experience.

Variations

Other variations can include:

  • Optional add-on questions that expand upon or deep dive into vital signs to gather more granular signals in a particular hotspot.
  • Shorter, more frequent surveys comprised of a minimal question subset to maintain a constant pulse on sentiment between survey periods.


Calculate the results

Est. time: 10 MIN

Once everyone’s completed the survey, close it, and review the data.

Next, assign each vital sign an opportunity score. If you have any outliers, call them out in your notes and discuss them with your team. You can use a spreadsheet tool to make your calculations easier, if you’d like.

Here’s how to calculate the opportunity score for each vital sign:

  • First, identify the average importance and the average satisfaction of your vital sign.
    • For example, 8.22 and 5.88, respectively.
  • Next, calculate the difference between the average importance and the average satisfaction.
    • For example, 8.22 - 5.88 = 2.34
  • Finally, if this number is positive, add it to the average importance to find your vital sign’s opportunity score. If the number is negative, your average importance is your opportunity score.
    • For example, 8.22 + 2.34 = 10.56

Opportunity score = importance + max(importance - satisfaction, 0)
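As a sanity check, the formula above can be expressed as a small Python helper (a sketch for your own spreadsheet or script, not part of the play itself):

```python
def opportunity_score(importance: float, satisfaction: float) -> float:
    """Opportunity score = importance + max(importance - satisfaction, 0)."""
    return importance + max(importance - satisfaction, 0.0)

# Worked example from the play: importance 8.22, satisfaction 5.88.
print(round(opportunity_score(8.22, 5.88), 2))  # 10.56

# When satisfaction exceeds importance, the score is just the importance.
print(opportunity_score(4.56, 6.34))  # 4.56
```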

Next, take the opportunity score for each of your vital signs and designate a rating:

Tip: MAP OUT YOUR DATA

If it’s helpful to visualize each of your vital signs relative to the others, you can plot your results on a scatter plot.

When to remove a vital sign

If average satisfaction is higher than average importance, the vital sign is probably not very important to your team, or your team is satisfied with it already. In the future, you can replace the vital sign with one you want to watch more closely.

  • Needs action (15+): extremely under-served areas to address first.
  • Improvement needed (10-15): areas that should be addressed soon.
  • Good (10 and below): well-served areas that do not need to be addressed.
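The rating bands can be encoded as a small helper. How to treat scores of exactly 10 or 15 is an assumption here, inferred from the sample survey results (10.15 is rated "improvement needed" and 15.07 "needs action"):

```python
def rating(score: float) -> str:
    """Map an opportunity score to the play's three rating bands.

    Boundary handling (>= 15, > 10) is an assumption inferred from the
    sample survey results table.
    """
    if score >= 15:
        return "NEEDS ACTION"
    if score > 10:
        return "IMPROVEMENT NEEDED"
    return "GOOD"

# Spot-check against the sample survey results.
print(rating(15.27))  # NEEDS ACTION (ways of working)
print(rating(11.55))  # IMPROVEMENT NEEDED (waiting time)
print(rating(9.03))   # GOOD (sustainable speed to ship)
```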

We've organized results from a sample survey into a table below.

Sample survey results

Vital sign                | Average importance | Average satisfaction | Opportunity score | Result
Sustainable speed to ship | 6.93               | 4.83                 | 9.03              | GOOD
Waiting time              | 7.48               | 3.41                 | 11.55             | IMPROVEMENT NEEDED
Execution independence    | 4.56               | 6.34                 | 4.56              | GOOD
Ways of working           | 8.30               | 1.33                 | 15.27             | NEEDS ACTION
External standards        | 2.67               | 5.87                 | 2.67              | GOOD
Maintenance               | 9.15               | 3.23                 | 15.07             | NEEDS ACTION
Onboarding                | 3.60               | 9.76                 | 3.60              | GOOD
Developer satisfaction    | 7.82               | 5.49                 | 10.15             | IMPROVEMENT NEEDED

Advanced math

An optional way to get more out of your findings is to calculate the satisfaction gap for each vital sign.

When you calculate the difference between a vital sign's average importance and its average satisfaction, you're calculating its satisfaction gap: how important the vital sign is to your developers versus how satisfied they are with it. A small gap means either low importance with low satisfaction, or high importance with high satisfaction; in both cases, that vital sign is a lower priority. A large gap means the vital sign is highly important to the team, but they are not satisfied with how it's currently handled, so addressing it is a high priority.
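Using values from the sample survey results, the satisfaction gaps can be computed and ranked like this (a sketch; the subset of vital signs shown is illustrative):

```python
# (average importance, average satisfaction) pairs taken from the
# sample survey results table.
vital_signs = {
    "Ways of working": (8.30, 1.33),
    "Maintenance": (9.15, 3.23),
    "Onboarding": (3.60, 9.76),
}

# Satisfaction gap = average importance - average satisfaction.
gaps = {name: imp - sat for name, (imp, sat) in vital_signs.items()}

# Largest gap first: the highest-priority vital signs to address.
for name, gap in sorted(gaps.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: gap = {gap:+.2f}")
```

Here "Ways of working" surfaces first: it is highly important but poorly rated, while "Onboarding" has a negative gap and is a candidate for replacement in a future survey.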