docs: Add release notes for 2.13.0 #6054

Merged 5 commits on Jul 2, 2024
53 changes: 53 additions & 0 deletions docs/source/guide/release_notes/onprem/2.13.0.md
@@ -0,0 +1,53 @@
---
hide_sidebar: true
---

## Label Studio Enterprise 2.13.0

<div class="onprem-highlight">Annotator performance dashboards, potential breaking change for Google Cloud Storage users, feature flag changes</div>

*Jul 02, 2024*

Helm Chart version: 1.4.9

### New features

#### Annotator performance dashboards (Beta)

With this release, you will see a new **Performance report** action available from the Organization page.

![Screenshot of Performance Report button](/images/project/user_report.png)

Clicking this takes you to a dashboard focused on annotator performance. This new dashboard is designed to help you manage your team, figure out resource allocation, and save the expense of building custom internal tracking tools.

![Screenshot of the annotator dashboard](/images/releases/2-13-annotator-report.png)

The annotator performance dashboard provides insight into each user’s annotation activity over a period of time. You can see how much time they spent annotating, how many annotations they submitted, and their average time spent per annotation. You can further refine this dashboard by workspace and project.

For more information, see [Annotator performance dashboard](dashboard_annotator) and [Annotator Dashboard Helps Optimize Team Performance](https://humansignal.com/blog/new-annotator-dashboard-helps-optimize-team-performance/).

### Enhancements

- Improved performance of the Projects list page through optimizations at the API level.
- The annotation review workflow has been improved with the following usability enhancements:
- When a Reviewer, Manager, or Admin clicks an annotated task in the Data Manager, they will now be taken directly to Review mode (bypassing the Submit/Update actions). This change is aimed at reducing the number of clicks required for their workflow and improving overall efficiency.
- Previously, the review stream displayed tasks in the reverse of the labeling order. With this release, the review stream mirrors the labeling stream order to maintain sequential context.

### Breaking changes

- Fixed an issue with Google Cloud Storage when the connection has the **Use pre-signed URLs** option disabled. In these situations, pre-signed URLs in the `https://storage.googleapis.com` format were being returned rather than the BLOBs themselves.

With this fix, Google Cloud Storage connections will return BLOBs (base64-encoded data) when **Use pre-signed URLs** is off. This means that Label Studio will read data directly from Google Cloud Storage buckets, which can result in large amounts of data being sent to your Label Studio instance, potentially affecting performance.
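
To make the impact concrete, here is a minimal sketch of the two behaviors using the `google-cloud-storage` Python client. The bucket and object names are hypothetical, and this illustrates the general pattern only, not Label Studio's actual code path:

```python
import base64
from datetime import timedelta

from google.cloud import storage

client = storage.Client()
blob = client.bucket("my-bucket").blob("tasks/image.png")  # hypothetical object

# "Use pre-signed URLs" ON: the browser fetches the object directly from GCS
# via a short-lived signed URL, so the bytes never pass through Label Studio.
signed_url = blob.generate_signed_url(expiration=timedelta(minutes=15))

# "Use pre-signed URLs" OFF (the behavior after this fix): the server downloads
# the object and serves it inline as base64, so all bytes flow through Label Studio.
data_uri = "data:image/png;base64," + base64.b64encode(blob.download_as_bytes()).decode()
```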

### Feature flag changes

- As part of an ongoing effort to streamline our codebase, we have identified a number of seldom-used feature flags. We have marked these feature flags as `stale`, meaning they can no longer be enabled by users. For a full list of affected feature flags, see https://github.com/HumanSignal/label-studio/pull/5971.

### Bug fixes

- Fixed an issue where Redis was unable to connect over SSL (see the connection sketch after this list).
- Fixed an issue where Redis storage connections were causing errors due to a missing field in the storage form (Storage Title).
- Fixed an issue where connected ML backends were unable to return more than one prediction per task.
- Fixed an issue where annotators were not being prompted to leave a comment when skipping a task, even though the project settings required them to do so.
- Fixed an issue where sometimes actual usernames were being replaced by a generic “Admin” username in the annotation history.
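
For reference, a minimal sketch of the kind of SSL-enabled Redis connection affected by the first fix, using `redis-py`. The host, port, and CA bundle path are hypothetical, and this is not Label Studio's internal code:

```python
import redis

r = redis.Redis(
    host="redis.example.internal",         # hypothetical endpoint
    port=6380,
    ssl=True,                              # connect over TLS
    ssl_cert_reqs="required",              # verify the server certificate
    ssl_ca_certs="/etc/ssl/certs/ca.pem",  # CA bundle used for verification
)
r.ping()  # raises a ConnectionError if the TLS handshake or connection fails
```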

Binary file modified docs/themes/v2/source/images/project/user_report.png