a) Work Log

From Daily Stand-Ups -> ‘Work-Log’, you can check all the activity for each engineer, such as code commits, merge commits, open pull requests, merged pull requests, pull request comments and pull request reviews. To see these stats, click the bubbles (commits) or the rhombuses (pull requests).

From this page, you’re able to filter by teams, repos, period and date.

b) Daily Update

From Settings -> ‘Daily Update’, you can check your team’s velocity by comparing the previous week’s impact against the current week’s, day by day. The red line represents the current week’s impact, while the grey line represents last week’s.

On this page, you can also see the work focus of the previous day, and whether any authors didn’t check in code.

From this page, you’re able to filter by team or repository.

c) Developer Summary

From the One-to-One -> ‘Developer Summary’ page, you will see an individual report for each engineer. You can view a summary of their work over the last period, their work focus, a summary graph, the commits’ risk, and all the commits produced. From this page, you’re able to filter by repos and date range.

If you scroll down, you will see the full summary graph and all the commits on a timeline, along with each commit’s focus and risk.

d) Retrospective

From Reports -> ‘Retrospective’, you’re able to see how much of the code was productive and how much of it was churn. On the right side of the page, you will see the amount of productive throughput, raw throughput, and the efficiency rate. From this page, you’re able to filter by teams, repos, and time frame.
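The efficiency rate ties the two throughput figures together. As a minimal sketch, assuming (as the report’s wording suggests) that efficiency is productive throughput as a share of raw throughput, with purely hypothetical numbers:

```python
# Hypothetical numbers; Waydev derives the real figures from your commits.
raw_throughput = 12_400   # total LoC written in the selected time frame
churn = 3_100             # LoC rewritten or deleted shortly after being written
productive_throughput = raw_throughput - churn   # 9,300 LoC

# Efficiency: the share of raw throughput that survived as productive work.
efficiency_rate = productive_throughput / raw_throughput
print(f"Efficiency: {efficiency_rate:.0%}")      # Efficiency: 75%
```

A falling efficiency rate over several sprints typically means a growing share of freshly written code is being reworked or thrown away.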

e) Project Timeline

From Reports -> ‘Project Timeline’, you’re able to see high-level stats, such as Total Impact, Code Volume, Commits Volume and Commits per Active Day. From this page, you’re able to filter by teams, repos, period and date.

If you scroll down, you can see the work trend in three different views: Absolute, Relative, Stacked.

If you scroll to the bottom of the page, you can view the top performers for each metric (New Work, Legacy Refactor, Help Others and Churn).

f) Time Card

From Daily Stand-Ups -> ‘Time Card’, you’re able to see at what hours the engineers commit most often. At the top of the page you will see the most active hour (most commits) for the entire project, and if you scroll down you will see the stats for each engineer. If you hover over a square, you will also see the number of commits for that hour. From this page, you’re able to filter by teams, repos, period and date range.

g) Dashboard

The Dashboard page is the most commonly used feature because it includes all the raw stats for the project and the engineers. At the top, you will see the overall Impact, Efficiency, Commits/Day and Active Days. When you scroll down you will see the total Impact, Code Volume, Commits Volume and Commits per Active Day. From this page, you’re able to filter by teams, repos, and date range.

If you scroll down, you will see the engineers list, along with their stats for the selected time frame. By default, our core metrics (New Work, Legacy Refactor, Help Others, and Churn) are displayed as percentages. You can have Waydev show absolute values (in LoC) for each engineer in the Dashboard by checking ‘Show Absolute Values’.
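To illustrate what the toggle changes, here is a hypothetical engineer’s breakdown rendered both ways (numbers invented; Waydev computes the real split from your commits):

```python
# Hypothetical engineer: 2,000 LoC in the selected time frame,
# split across the four core metrics as percentages.
total_loc = 2_000
breakdown = {"New Work": 0.55, "Legacy Refactor": 0.20,
             "Help Others": 0.10, "Churn": 0.15}

# With 'Show Absolute Values' checked, the same split is shown in LoC:
for metric, share in breakdown.items():
    print(f"{metric}: {share:.0%} -> {share * total_loc:.0f} LoC")
```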

h) Compare

h.1. Team Compare

The “Team Compare” feature from the Reports submenu allows you to compare your teams’ performance indicators. You can choose which teams to compare, which repositories to compare and which period to analyze. The performance indicators analyzed are: “Impact”, “Efficiency”, “Active Days”, “Commits/Day”, “New Work”, “Legacy Refactor”, “Help Others” and “Churn Code”. On the right side of the page you will see how the 2nd team performed relative to the 1st team.

h.2. Developer Compare

The “Developer Compare” feature from the One-to-One submenu enables you to compare your developers’ performance indicators. You can compare an engineer’s work against a previous period, or against another engineer’s. You can choose which developers to compare, which repositories to compare and which period to analyze.


i) Code Review

The Review & Collaborate features provide valuable insights into organizational behavior with respect to how individuals are collaborating with their peers during the code review process. 

i.1. Review Workflow

Review Workflow shows all the Pull Requests that were open at any time during the selected period. PRs opened before the selected period are included if they were still open during it. Use this report as your starting place for a bird’s-eye view of all Pull Request activity.

You can select which team’s review workflow to see, which repositories’ review workflow to see, what period the review workflow should span, as well as which pull requests to show (open, merged, closed or all).

The bars represent the time it took for a pull request to close, while the numbers on the right side of the page are the pull requests’ IDs. The bubbles indicate follow-on commits, while the half bars indicate comments. If you click on a bar, details about that particular pull request will pop up. You can see the pull request ID, the engineer who created the pull request, when it was opened and when it was merged, how much time passed until the first comment, the work level, the number of commits, the number of comments, the number of reviews and its status. You will also see a timeline of the follow-on commits and comments on the pull request. If you click on a commit title, the commit page from the Git provider will open in a new window.

i.2. Review Collaboration

Review Collaboration shows code collaboration stats between the submitters and the reviewers. Engineers can play the role of both submitter and reviewer. You can select which team’s review collaboration stats to view, as well as which repositories and what period to analyze.

Submitter Metrics refer to “Responsiveness”, “Comments Addressed”, “Receptiveness” and “Unreviewed PRs”, as well as to the number & type of comments. These metrics are detailed in 9. f) Code Review Metrics.

Reviewer Metrics refer to “Reaction Time”, “Involvement”, “Influence” and “Review Coverage”, as well as to the number & type of comments. These metrics are also detailed in 9. f) Code Review Metrics.

The Sharing Index offers a daily overview of how broadly information is being shared amongst a team by looking at who's reviewing whose PRs. If you hover over a column in the chart, it will show you the number of PRs, the number of Active Reviewers and Submitters, as well as the Sharing Index.

Sharing Index metrics are:

PRs is the total number of PRs that were reviewed;

Sharing Index measures how broadly information is being shared amongst a team by looking at who is reviewing whose PRs;

Active Reviewers is the count of active users who actually reviewed a PR in the selected time period;

Submitters is the total number of users who submitted a PR in the selected time period.
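Waydev doesn’t spell out the Sharing Index formula in this article, so as a purely illustrative sketch (not Waydev’s actual computation), one way to picture ‘breadth of sharing’ is the fraction of possible reviewer-to-submitter pairings that actually occurred on a given day:

```python
# Purely illustrative -- NOT Waydev's actual formula. One way to picture
# breadth of sharing: what fraction of the possible reviewer->submitter
# pairings actually occurred on a given day?
reviews = [  # (reviewer, submitter) for each reviewed PR that day
    ("Elton Jobs", "Eric Allan"),
    ("Eliott Brown", "Eric Allan"),
    ("Eric Allan", "Elton Jobs"),
]
reviewers = {r for r, _ in reviews}    # active reviewers
submitters = {s for _, s in reviews}   # submitters whose PRs were reviewed
observed_pairs = set(reviews)

# Possible pairings exclude self-review (reviewing your own PR).
possible = len(reviewers) * len(submitters) - len(reviewers & submitters)
sharing_index = len(observed_pairs) / possible if possible else 0.0
print(f"PRs: {len(reviews)}, Reviewers: {len(reviewers)}, "
      f"Submitters: {len(submitters)}, Sharing Index: {sharing_index:.2f}")
```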

If you hover over a Submitter/Reviewer profile picture or name, lines pointing to other profiles may show up. If you hover over an engineer’s profile in the Submitter section, the lines point to the engineers in the Reviewer section who reviewed that engineer’s pull request(s). In the example above, Eric Allan’s lines point to Elton Jobs and Eliott Brown, meaning these two engineers reviewed his pull request(s). If you hover over an engineer’s profile in the Reviewer section, the lines point to the engineers whose pull requests that reviewer reviewed. If no lines appear, the engineer didn’t participate in the code collaboration.

i.3. PR Resolution

PR Resolution shows a time-to-close work trend graph for the selected period, as well as additional code collaboration metrics. You can select which team’s PR Resolution to view, which repositories to include, and what period to analyze.

The PR Resolution report comprises six metrics: “Time to Resolve”, “Time to First Comment”, “Follow-on Commits”, “Reviewers”, “Reviewer Comments” and “Avg. Comments per Reviewer”. These metrics are detailed in 9. f) Code Review Metrics.


The Fundamentals feature gives you the ability to see your team's performance across the fundamental productivity metrics for both submitters and reviewers.

i.4. Submitter Fundamentals provides a big-picture view of the submitter metrics. You can select which team's performance to view, which repositories' contribution to analyze and what period to inspect. Each metric has its own average value, which you can see in the tab view or on the left side of the chart.

This tab analyzes the Responsiveness metric (the average time it takes to respond to a comment with either another comment or a code revision). Click the metric tabs (Responsiveness, Comments Addressed, Receptiveness and Unreviewed PRs) to switch between them. Each metric has its corresponding chart.


If you hover over a column in the chart, you'll see the metric's value for that particular period.


This tab analyzes the Comments Addressed metric (the percentage of Reviewer comments that were responded to with a comment or a code revision). 


This tab evaluates the Receptiveness metric (the ratio of follow-on commits to comments).


This tab investigates the Unreviewed PRs metric (the percentage of PRs submitted that had no comments).
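Taken together, the four submitter metrics reduce to simple averages and ratios over pull-request activity. A minimal sketch on hypothetical PR records (the field names are invented for illustration; Waydev computes these from your Git provider’s data):

```python
# Hypothetical PR records; the field names are invented for illustration.
prs = [
    {"response_hours": [2.0, 5.0], "reviewer_comments": 4,
     "comments_addressed": 3, "follow_on_commits": 2},
    {"response_hours": [], "reviewer_comments": 0,
     "comments_addressed": 0, "follow_on_commits": 0},  # an unreviewed PR
]

responses = [h for pr in prs for h in pr["response_hours"]]
total_comments = sum(pr["reviewer_comments"] for pr in prs)

responsiveness = sum(responses) / len(responses)  # avg hours to respond
comments_addressed = sum(pr["comments_addressed"] for pr in prs) / total_comments
receptiveness = sum(pr["follow_on_commits"] for pr in prs) / total_comments
unreviewed_prs = sum(pr["reviewer_comments"] == 0 for pr in prs) / len(prs)

print(f"Responsiveness: {responsiveness:.1f}h | "
      f"Comments Addressed: {comments_addressed:.0%} | "
      f"Receptiveness: {receptiveness:.2f} | "
      f"Unreviewed PRs: {unreviewed_prs:.0%}")
```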


i.5. Reviewer Fundamentals offers an overview of the reviewer metrics. You can select which team's performance to view, which repositories' contribution to analyze and what period to inspect. Each metric has its own average value, which you can see in the tab view or on the left side of the chart.

This tab analyzes the Reaction Time metric (the average time it took to respond to a comment).


This tab analyzes the Involvement metric (the percentage of PRs a reviewer participated in).


This tab analyzes the Influence metric (the ratio of follow-on commits to comments made in PRs).


This tab analyzes the Review Coverage metric (the percentage of PRs reviewed).
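The reviewer metrics mirror the submitter ones, computed from the reviewer’s side. Another hypothetical sketch with invented numbers:

```python
# One reviewer's metrics over a period, on hypothetical numbers.
total_prs = 10                     # PRs submitted by the team in the period
prs_participated_in = 4            # PRs this reviewer commented on
prs_reviewed = 6                   # PRs this reviewer reviewed
comments_made = 12                 # review comments left across all PRs
follow_on_commits = 9              # commits prompted by those comments
reaction_hours = [1.5, 3.0, 0.5]   # time to respond to each comment

reaction_time = sum(reaction_hours) / len(reaction_hours)   # avg hours
involvement = prs_participated_in / total_prs               # % participated in
influence = follow_on_commits / comments_made               # commits per comment
review_coverage = prs_reviewed / total_prs                  # % of PRs reviewed

print(f"Reaction Time: {reaction_time:.1f}h | Involvement: {involvement:.0%} | "
      f"Influence: {influence:.2f} | Review Coverage: {review_coverage:.0%}")
```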

j) Reports History

The ‘Reports History’ page, under the Reports submenu, allows you to create and download weekly/monthly historical reports regarding your teams' accomplishments and work focus.


You can sort reports by team and period. On the rightmost side of the page you can either view the report or download it as a PDF.

