amazon-research / contextual-attention-nlm

Accompanying code for the paper "Attention-Based Contextual Language Model Adaptation for Speech Recognition", submitted to ACL 2021.

Summary
  • Main code: 1,168 LOC (9 files), 100% Python
  • Secondary code: tests 0 LOC (0 files); generated 0 LOC (0 files); build & deploy 0 LOC (0 files); other 160 LOC (5 files)
  • Duplication: 13%
  • File size: 0% long (> 1000 LOC), 65% short (<= 200 LOC)
  • Unit size: 0% long (> 100 LOC), 53% short (<= 10 LOC)
  • Conditional complexity: 0% complex (McCabe index > 50), 69% simple (McCabe index <= 5)
  • Logical component decomposition: primary (2 components)

File age: all files are less than a month old.

  • 0% of code older than 365 days
  • 0% of code not updated in the past 365 days
  • 0% of code updated more than 50 times

Also see temporal dependencies for files frequently changed in same commits.

Goals: Keep the system simple and easy to change (4)
Features of interest: TODOs (1 file)
Commits Trend

Latest commit date: 2021-05-21

  • Commits (past 30 days): 0
  • Contributors (past 30 days): 0
  • Commits (2021): 3
  • Contributors (2021): 1
Reports

  • Analysis Report
  • Trend Analysis Report
Notes & Findings
Links

generated by sokrates.dev (configuration) on 2022-01-31