Labs hiring

We hire slowly. Read these three sections before you send a note; they describe how the work actually feels, with no stock “culture” paragraphs.

Dashboard lab sessions

Mentors run twice-weekly live blocks where learners screen-share failing charts. You will speak aloud while typing, narrate rollback plans, and annotate memos in shared drives. Noise-canceling headsets are non-negotiable.

  • Prepare three intentional break scripts per cohort.
  • Log incidents with timestamps even when nothing catches fire.
  • Swap critique styles with peers mid-season to avoid monotone feedback.

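The incident-logging habit above can be sketched as data. This is a minimal illustration, not our actual tooling; the `Incident` and `record` names are hypothetical, and the point is that quiet sessions still produce timestamped rows.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Incident:
    """One lab-session entry, logged even when nothing catches fire."""
    summary: str
    severity: str = "none"  # "none" is a valid severity: quiet blocks still get a row
    logged_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

log: list[Incident] = []

def record(summary: str, severity: str = "none") -> Incident:
    """Append a timestamped entry to the session log."""
    entry = Incident(summary, severity)
    log.append(entry)
    return entry

# A rollback rehearsal that went fine is still worth a line.
record("Chart rendered correctly after rollback rehearsal")
record("Null spike in revenue panel", severity="high")
```

The design choice worth noticing: the timestamp is taken at creation, not at write time, so the log reflects when the observation happened rather than when someone got around to saving it.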
SQL practice bank

Our question bank is versioned like code. Coordinators refresh seeds, verify EXPLAIN plans still match learning goals, and retire questions that encourage lazy DISTINCT abuse.

  • Maintain anonymized exports with documented grain.
  • Tag difficulty with honest pain scores, not marketing stars.
  • Pair every new question with a sample “good enough” answer for mentors.
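"Versioned like code" can be made concrete with a small sketch. Everything here is illustrative, not our real schema: the `BankQuestion` record and `bump` helper are hypothetical names, but they show the invariant the coordinators maintain, that refreshing a prompt or seed produces a new version rather than mutating the old one, and that every question carries its honest pain score and a "good enough" answer.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class BankQuestion:
    """One versioned entry in the practice bank (field names are illustrative)."""
    qid: str
    version: int
    pain_score: int      # honest 1-5 pain, not marketing stars
    prompt: str
    sample_answer: str   # the "good enough" answer mentors compare against
    retired: bool = False  # e.g. retired for encouraging lazy DISTINCT abuse

def bump(q: BankQuestion, new_prompt: str) -> BankQuestion:
    """A prompt refresh is a new immutable version, like a commit."""
    return replace(q, version=q.version + 1, prompt=new_prompt)

q1 = BankQuestion(
    qid="orders-grain-01",
    version=1,
    pain_score=3,
    prompt="Find duplicate order rows without using DISTINCT.",
    sample_answer="GROUP BY order_id HAVING COUNT(*) > 1",
)
q2 = bump(q1, "Find duplicate order rows; explain why DISTINCT would hide the bug.")
```

Because the dataclass is frozen, `q1` survives the refresh untouched, which is what lets old cohorts' results stay comparable.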

Reporting workflow map

Below is the diagram every analyst mentor should be able to sketch from memory—raw extract to stakeholder memo, with explicit human checkpoints.

Reporting workflow:
  Extract contracts → Model grain + tests → Visual semantics QA → Memo stakeholder read
  Human checkpoints after each box — no silent automation.
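The workflow map can be expressed as data plus one gate. This is a sketch under an assumption the diagram only implies: the `run` function and `signoffs` mapping are hypothetical, but they encode the rule that no stage proceeds without a human sign-off.

```python
# Stage order from the workflow map, raw extract to stakeholder memo.
STAGES = [
    "Extract contracts",
    "Model grain + tests",
    "Visual semantics QA",
    "Memo stakeholder read",
]

def run(signoffs: dict[str, bool]) -> list[str]:
    """Advance through stages, halting at the first missing human checkpoint."""
    completed: list[str] = []
    for stage in STAGES:
        if not signoffs.get(stage, False):  # no silent automation
            break
        completed.append(stage)
    return completed

# If visual QA never signed off, the memo read never happens.
done = run({"Extract contracts": True, "Model grain + tests": True})
```

The absence of a sign-off defaults to `False`, so forgetting a checkpoint stops the pipeline rather than skipping it, which is the whole point of the map.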
Introduce yourself