The Code Review features provide insight into how individuals collaborate with their peers during the code review process.

i.1. Review Workflow

Review Workflow shows all the Pull Requests that were open at any time during the selected period, including PRs opened before the period that were still open during it. Use this report as your starting place for a bird's-eye view of all Pull Request activity.

You can select which team’s review workflow to view, which repositories to include, the period to analyze, and which pull requests to show (open, merged, closed, or all).

The bars represent the time it took for a pull request to close, and the numbers on the right side of the page are the pull request IDs. The bubbles indicate follow-on commits, while the half bars indicate comments. If you click on a bar, details about that particular pull request pop up: the pull request ID, the engineer who created it, when it was opened and when it was merged, how much time passed until the first comment, the work level, the number of commits, comments, and reviews, and its status. You will also see a timeline of the follow-on commits and comments on the pull request. If you click on a commit title, the commit page from the Git provider opens in a new window.

i.2. Review Collaboration

Review Collaboration shows code collaboration stats between submitters and reviewers. Engineers can act as both submitters and reviewers. You can select which team’s review collaboration stats to view, which repositories to analyze, and the period to analyze.

Submitter Metrics cover “Responsiveness”, “Comments Addressed”, “Receptiveness”, and “Unreviewed PRs”, as well as the number and type of comments.

Reviewer Metrics cover “Reaction Time”, “Involvement”, “Influence”, and “Review Coverage”, as well as the number and type of comments.

The Sharing Index offers a daily overview of how broadly information is being shared amongst a team by looking at who's reviewing whose PRs. If you hover over a column in the chart, it will show you the number of PRs, the number of Active Reviewers and Submitters, as well as the Sharing Index.

Sharing Index metrics are:

PRs is the total number of PRs that were reviewed;

Sharing Index measures how broadly information is being shared amongst a team by looking at who is reviewing whose PRs;

Active Reviewers is the count of active users who actually reviewed a PR in the selected time period;

Submitters is the total number of users who submitted a PR in the selected time period.
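The product does not publish the exact Sharing Index formula, but the component counts it is built from (as listed above) are straightforward to sketch. The data shape below is an assumption for illustration, not the product's actual data model:

```python
# Hypothetical review events for one day: (submitter, reviewer, pr_id).
reviews = [
    ("eric", "elton", 101),
    ("eric", "eliott", 101),
    ("elton", "eric", 102),
    ("eliott", "elton", 103),
]

# PRs: total number of PRs that were reviewed.
prs_reviewed = len({pr for _, _, pr in reviews})
# Active Reviewers: users who actually reviewed a PR in the period.
active_reviewers = len({reviewer for _, reviewer, _ in reviews})
# Submitters: users who submitted a PR that was reviewed in the period.
submitters = len({submitter for submitter, _, _ in reviews})

print(prs_reviewed, active_reviewers, submitters)  # → 3 3 3
```

The Sharing Index itself would then be derived from how evenly the review relationships are spread across these participants; its precise formula is not specified here.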

If you hover over a Submitter or Reviewer profile picture or name, lines may appear pointing to other profiles. Hovering over an engineer in the Submitter section draws lines to the engineers in the Reviewer section who reviewed that engineer’s pull requests. In the example above, Eric Allan’s lines point to Elton Jobs and Eliott Brown, meaning these two engineers reviewed his pull requests. Conversely, hovering over an engineer in the Reviewer section draws lines to the engineers whose pull requests they reviewed. If no lines appear, the engineer did not participate in code review collaboration.

i.3. PR Resolution

PR Resolution shows a time-to-close work trend graph for the selected period, along with additional code collaboration metrics. You can select which team’s PR Resolution to view, which repositories to include, and the period to analyze.

There are six metrics that comprise the PR Resolution report: “Time to Resolve”, “Time to First Comment”, “Follow-on Commits”, “Reviewers”, “Reviewer Comments” and “Avg. Comments per Reviewer”.

The Submitter and Reviewer Fundamentals give you the ability to see your team's performance across the fundamental productivity metrics for both submitters and reviewers.

i.4. Submitter Fundamentals provide a closer look at the submitter metrics. You can select which team's performance to view, which repositories' contributions to analyze, and what period to inspect. Each metric has its own average value, shown in the tab view and on the left side of the chart.

This tab analyzes Responsiveness (the average time it takes a submitter to respond to a comment with either another comment or a code revision). Click the metric tabs (Responsiveness, Comments Addressed, Receptiveness, and Unreviewed PRs) to move between them. Each metric has its corresponding chart.

If you hover over a column in the chart, you'll see the metric's value for that particular period.

This tab analyzes the Comments Addressed metric (the percentage of Reviewer comments that were responded to with a comment or a code revision). 

This tab evaluates the Receptiveness metric (the ratio of follow-on commits to comments).

This tab investigates the Unreviewed PRs metric (the percentage of PRs submitted that had no comments).
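The submitter metrics defined above are simple ratios, which the following sketch makes concrete. The per-PR fields are hypothetical names chosen for illustration, not the product's actual data model:

```python
# Assumed per-PR data for one submitter over the selected period.
prs = [
    {"comments": 4, "comments_addressed": 3, "follow_on_commits": 2},
    {"comments": 0, "comments_addressed": 0, "follow_on_commits": 0},  # unreviewed PR
    {"comments": 6, "comments_addressed": 6, "follow_on_commits": 3},
]

total_comments = sum(p["comments"] for p in prs)
addressed = sum(p["comments_addressed"] for p in prs)
follow_on = sum(p["follow_on_commits"] for p in prs)

# Comments Addressed: % of reviewer comments answered with a comment or code revision.
comments_addressed_pct = 100 * addressed / total_comments
# Receptiveness: ratio of follow-on commits to comments.
receptiveness = follow_on / total_comments
# Unreviewed PRs: % of submitted PRs that received no comments.
unreviewed_pct = 100 * sum(p["comments"] == 0 for p in prs) / len(prs)

print(round(comments_addressed_pct), receptiveness, round(unreviewed_pct))  # → 90 0.5 33
```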

i.5. Reviewer Fundamentals offers an overview of the reviewer metrics. You can select which team's performance to view, which repositories' contributions to analyze, and what period to inspect. Each metric has its own average value, shown in the tab view and on the left side of the chart.

This tab analyzes the Reaction Time (the average time it took to respond to a comment) metric.

This tab analyzes the Involvement (the percentage of PRs a reviewer participated in).

This tab analyzes the Influence (the ratio of follow-on commits to comments made in PRs).

This tab analyzes the Review Coverage (the percentage of PRs reviewed).
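Two of the reviewer metrics defined above, Involvement and Influence, can be sketched the same way. The data shape below is an assumption for illustration:

```python
# All PR ids open for the team in the period (assumed sample data).
all_pr_ids = {101, 102, 103, 104}

# PRs this reviewer commented on, with the outcomes of those comments.
reviewed = {
    101: {"comments": 3, "follow_on_commits": 2},
    103: {"comments": 2, "follow_on_commits": 1},
}

# Involvement: % of PRs the reviewer participated in.
involvement_pct = 100 * len(reviewed) / len(all_pr_ids)

# Influence: ratio of follow-on commits to comments made in PRs.
comments = sum(v["comments"] for v in reviewed.values())
follow_on = sum(v["follow_on_commits"] for v in reviewed.values())
influence = follow_on / comments

print(involvement_pct, influence)  # → 50.0 0.6
```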
