<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/">
  <channel>
    <title>Higher-Music-Education on Sebastian Spicker</title>
    <link>https://sebastianspicker.github.io/tags/higher-music-education/</link>
    <description>Recent content in Higher-Music-Education on Sebastian Spicker</description>
    <image>
      <title>Sebastian Spicker</title>
      <url>https://sebastianspicker.github.io/og-image.png</url>
      <link>https://sebastianspicker.github.io/og-image.png</link>
    </image>
    <generator>Hugo -- 0.160.0</generator>
    <language>en</language>
    <lastBuildDate>Fri, 22 Nov 2024 00:00:00 +0000</lastBuildDate>
    <atom:link href="https://sebastianspicker.github.io/tags/higher-music-education/index.xml" rel="self" type="application/rss+xml" />
    <item>
      <title>After the Connection Is Stable, the Hard Part Begins</title>
      <link>https://sebastianspicker.github.io/posts/nmp-curriculum-reflective-practice/</link>
      <pubDate>Fri, 22 Nov 2024 00:00:00 +0000</pubDate>
      <guid>https://sebastianspicker.github.io/posts/nmp-curriculum-reflective-practice/</guid>
      <description>A third post in the networked music performance series. Technical latency is solved. Institutional infrastructure has a name. What students actually learn — and what conservatoire curricula consistently get wrong about teaching it — turns out to be a different problem entirely.</description>
      <content:encoded><![CDATA[<p><em>Third post in a series. The <a href="/posts/nmp-latency-lola-mvtp/">August 2023 post</a>
covered latency measurements across six European research-network links.
The <a href="/posts/digital-music-labs-infrastructure/">June 2024 post</a> covered
what institutional infrastructure needs to look like for any of that to
be sustainably usable. This one covers what happens after both of those
problems are solved — which is when the genuinely interesting educational
challenges start.</em></p>
<p><em>Based on a manuscript with colleagues from the RAPP Lab. Not yet peer-reviewed.</em></p>
<hr>
<h2 id="the-gap-nobody-talks-about">The Gap Nobody Talks About</h2>
<p>There is a version of the NMP success story that stops too early. It goes: we
installed LoLa, measured the latency, it came in at 9.5 ms to Vienna, the
musicians played together across 745 km, it worked. Success.</p>
<p>What this story skips is the classroom after the demo. The student who can
follow a setup checklist perfectly and still has no idea what to do musically
when the connection is stable. The ensemble that gets a clean signal running
and then plays exactly the same repertoire in exactly the same way they would
in a co-present rehearsal, fighting the latency instead of working with it,
frustrated when it does not feel right. The assessment rubric that checks off
&ldquo;maintained stable connection&rdquo; and &ldquo;completed the performance&rdquo; and has nothing
to say about everything that actually constitutes musical learning in a
networked context.</p>
<p>The gap between <em>technical feasibility</em> and <em>educational transformation</em> is
the subject of this post. Closing it turns out to require a different kind of
curriculum design than most conservatoires have tried.</p>
<hr>
<h2 id="what-gets-taught-versus-what-needs-to-be-learned">What Gets Taught Versus What Needs to Be Learned</h2>
<p>The default curricular response to NMP has been to treat it as a technical
skill with an artistic application. Students learn to configure an audio
interface, manage routing, establish a LoLa connection, and then — implicitly
— go do music. The technical content gets staged as a prerequisite to the
&ldquo;real&rdquo; work.</p>
<p>This ordering is wrong in a specific way. Technical setup work is genuinely
necessary, but making it a prerequisite treats the relationship between
technology and musical practice as sequential rather than recursive. In
practice, the interesting musical problems only become visible <em>through</em> the
technical ones. A student does not understand why buffer size matters until
they have felt the difference between a 5 ms and a 40 ms offset in a
coordination-intensive passage. A student does not develop an opinion about
audio routing configurations until they have experienced a rehearsal collapse
caused by a routing error they could have prevented.</p>
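<p>The arithmetic behind that felt difference is worth making concrete. A minimal
sketch of how distance and buffer size combine into the offset a player actually
hears — the propagation constant, buffer sizes, and sample rate here are
illustrative assumptions for the calculation, not figures from the modules:</p>
<pre><code class="language-python"># Illustrative one-way latency budget for a point-to-point NMP link.
# The speed-in-fiber figure and buffer sizes are assumptions for this
# sketch, not measurements from the sessions described above.

SPEED_IN_FIBER_KM_PER_MS = 200.0  # roughly 2/3 of c, a common rule of thumb

def buffer_latency_ms(buffer_samples: int, sample_rate_hz: int) -> float:
    """Delay added by filling one audio buffer of the given size."""
    return buffer_samples / sample_rate_hz * 1000.0

def one_way_latency_ms(distance_km: float, buffer_samples: int,
                       sample_rate_hz: int = 48000) -> float:
    """Fiber propagation plus one capture buffer and one playback buffer."""
    propagation = distance_km / SPEED_IN_FIBER_KM_PER_MS
    buffering = 2 * buffer_latency_ms(buffer_samples, sample_rate_hz)
    return propagation + buffering

# Over ~745 km, small buffers keep the budget in the single-digit
# millisecond range a musician can treat like a large room; large
# buffers alone push it toward the range where tight rhythmic
# coordination starts to break down.
for buf in (32, 128, 512):
    print(buf, round(one_way_latency_ms(745, buf), 1))
</code></pre>
<p>Real links add interface, driver, and switching delays on top of this floor,
which is why a measured figure sits above pure propagation — and why a student
who has felt the difference understands the budget in a way no table of numbers
conveys.</p>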
<p>The RAPP Lab&rsquo;s recurring insight across several years of module iterations
at HfMT Köln was more direct: once learners can establish a stable connection,
the harder challenge is developing artistic, collaborative and reflective
strategies for making music <em>together apart</em>. Technical fluency is a
foundation, not a destination.</p>
<hr>
<h2 id="the-curriculum-we-ended-up-with">The Curriculum We Ended Up With</h2>
<p>It took several cycles to get there. The early format was weekend workshops —
open, exploratory, no formal assessment, primarily for advanced students who
self-selected in. These were useful precisely because they were informal: they
revealed quickly how technical and musical questions become inextricable once
you are actually playing, and they gave us evidence about where students got
stuck that we would not have found from a needs analysis.</p>
<p>Over time, elements of those workshops were developed into recurring
curriculum-embedded modules, which then fed into independent study projects
and eventually into external collaborations and performances. The trajectory
mattered: moving from a one-off event to something longitudinal meant that
knowledge built across cohorts rather than resetting every time.</p>
<p>The module structure that emerged has three interlocking elements:</p>
<p><strong>Progressive task design.</strong> Early sessions are tightly scoped:
specific technical-musical exercises, limited repertoire, well-defined
success criteria. Later sessions move toward open-ended projects, student-led
rehearsal planning, and eventually cross-institutional partnerships where
variables are genuinely outside anyone&rsquo;s control. The point of the early
constraints is not to make things easier — it is to create conditions where
students can notice what they are doing rather than just surviving.</p>
<p><strong>Journals and debriefs.</strong> Students kept individual reflective journals
throughout modules, documenting not just what happened but how they responded
to it — technical problems, musical decisions, moments of coordination failure
and recovery, questions they could not answer at the time. Group debriefs
after each rehearsal then turned those individual threads into collective
knowledge: comparing strategies, naming the problems that came up repeatedly,
developing shared language for rehearsal coordination.</p>
<p>The debrief is the part of this model that I think gets undervalued. It is
not just reflection — it is <em>curriculum production</em>. Strategies that emerged
from one cohort&rsquo;s debriefs became documented starting points for subsequent
cohorts. Knowledge accumulated rather than evaporating when the semester ended.</p>
<p><strong>Portfolio assessment.</strong> Rather than assessing primarily on a final
performance, students assembled portfolios that could include curated journal
excerpts, rehearsal documentation, reflective syntheses, and accounts of
how their thinking changed. The question being assessed was not &ldquo;did you play
the concert&rdquo; but &ldquo;can you articulate why you made the decisions you made, and
what you would do differently.&rdquo;</p>
<hr>
<h2 id="what-students-actually-learn-when-the-curriculum-works">What Students Actually Learn (When the Curriculum Works)</h2>
<p>Four outcomes recurred across the RAPP Lab iterations, consistently enough
to be worth naming:</p>
<h3 id="1-technical-agency">1. Technical agency</h3>
<p>This is different from technical competence. Competence means you can follow
a procedure. Agency means you understand the procedure well enough to deviate
from it intelligently when something goes wrong — to diagnose what failed,
generate a hypothesis about why, and try something different.</p>
<p>The shift happened when students stopped treating technical problems as
interruptions to the music and started treating them as information about
the system they were working inside. A dropout is not just an annoyance; it
is evidence about where the failure occurred. Getting to that reframe took,
on average, several weeks of structured reflection. It did not happen from
reading documentation.</p>
<h3 id="2-adaptive-improvisation">2. Adaptive improvisation</h3>
<p>Latency changes what real-time musical coordination can mean. You cannot rely
on the same multimodal cues — breath, gesture, shared acoustics — that make
co-present ensemble playing feel intuitive. You have to develop explicit
cueing systems, turn-taking conventions, contingency plans for when the
connection degrades mid-performance.</p>
<p>What we observed was that this constraint generated a specific kind of
musical creativity. Students improvised not just with musical material but
with rehearsal organisation itself — inventing systems, testing them,
discarding the ones that did not work, documenting the ones that did. Some of
the most musically interesting moments in the modules came from sessions where
the technology was behaving badly and students had to make it work anyway.</p>
<p>There is research on &ldquo;productive failure&rdquo; — deliberately designing tasks that
exceed students&rsquo; current control, because the struggle and recovery produce
deeper learning than smooth execution (Kapur 2016). NMP turns out to be a
natural context for this, not by design but because the network does not
cooperate on schedule.</p>
<h3 id="3-collaborative-communication">3. Collaborative communication</h3>
<p>Co-present rehearsal relies heavily on implicit communication: the
physical space makes many things legible without anyone having to say them.
In a networked rehearsal, the spatial and gestural channel is degraded or
absent. Students had to make explicit what would normally be implicit —
articulating coordination strategies, naming the problems they were
experiencing rather than hoping the ensemble would notice, developing a
vocabulary for talking about timing and latency as musical parameters.</p>
<p>This turned out to generalise. Students who had worked through several
networked rehearsal cycles were noticeably better at explicit musical
communication in co-present settings too, because they had been forced to
develop the vocabulary in a context where it was necessary.</p>
<h3 id="4-reflective-identity">4. Reflective identity</h3>
<p>The students who got the most out of the modules were the ones who stopped
waiting for the conditions to improve and started working with the conditions
as they were. Latency as a compositional constraint rather than a defect to
be routed around. Uncertainty as an artistic condition rather than a
technical failure.</p>
<p>The journal entries where this shift is most visible are not the ones that
describe what the student did. They are the ones that describe a change in
how the student understands their own practice — who they are as a musician
in relation to an environment they cannot fully control. That is a different
kind of outcome than anything a timing metric captures.</p>
<hr>
<h2 id="the-assessment-problem">The Assessment Problem</h2>
<p>The hardest part of all of this to translate into institutional language is
assessment. The conservatoire has well-developed frameworks for evaluating
performances. It has much weaker frameworks for evaluating the learning that
happens before and between and underneath performances.</p>
<p>Checklist rubrics — was the connection stable, was the latency within
acceptable range, did the performance complete — are useful for safety and
reliability. They are poor evidence for whether a student has developed the
capacity to work reflectively and artistically in a mediated ensemble
environment. A student who achieved a stable connection by following
instructions exactly and a student who achieved it by diagnosing a routing
error mid-session look identical on a checklist. They have had very different
learning experiences.</p>
<p>Portfolio assessment addresses this by making the reasoning visible. When a
student can explain why they chose a particular buffer configuration given
the specific network characteristics of that session, how that choice affected
the musical phrasing in the piece they were rehearsing, and what they would
change next time — that is evidence of something real. It is also harder to
assess than a timing log, which is probably why most programmes avoid it.</p>
<p>The argument is not that quantitative indicators are useless. It is that
they function better as scaffolding for reflective judgement than as the
primary evidence of learning. Mixed assessment ecologies — technical logs
plus journals plus portfolio syntheses — are more honest about what is
actually happening educationally.</p>
<hr>
<h2 id="what-this-does-not-solve">What This Does Not Solve</h2>
<p>The model described here depends on teaching staff who can facilitate
reflective dialogue, curate knowledge across cohorts, and participate in
iterative curriculum redesign. That is a specific professional competence
that is not automatically present in a conservatoire staffed primarily by
performing musicians. The training and support structures needed to develop
it are an open question the manuscript does not fully answer.</p>
<p>The curriculum is also not portable as-is. The RAPP Lab model emerged in a
specific institutional context — HfMT Köln, specific partner network,
specific funding structure, specific cohort of students. The four outcomes
and the general pedagogical logic may transfer; the specific formats will
need adaptation. Any institution that tries to implement this without going
through at least one cycle of their own iterative development is likely to
end up with a checklist version of something that works only when it is a
living process.</p>
<p>And the technology keeps moving. LoLa is a mature platform but the
ecosystem around it — network configurations, operating system support,
hardware lifecycles — changes faster than curriculum documentation. Building
responsiveness into the curriculum itself, rather than treating it as a fixed
syllabus, is the structural answer. Easier to recommend than to institutionalise.</p>
<hr>
<h2 id="references">References</h2>
<p>Barrett, H. C. (2007). Researching electronic portfolios and learner
engagement. <em>Journal of Adolescent &amp; Adult Literacy</em>, 50(6), 436–449.</p>
<p>Borgdorff, H. (2012). <em>The Conflict of the Faculties.</em> Leiden University Press.</p>
<p>The Design-Based Research Collective (2003). Design-based research: An
emerging paradigm for educational inquiry. <em>Educational Researcher</em>, 32(1),
5–8.</p>
<p>Kapur, M. (2016). Examining productive failure, productive success,
unproductive failure, and unproductive success in learning. <em>Educational
Psychologist</em>, 51(2), 289–299. <a href="https://doi.org/10.1080/00461520.2016.1155457">https://doi.org/10.1080/00461520.2016.1155457</a></p>
<p>Lave, J. &amp; Wenger, E. (1991). <em>Situated Learning.</em> Cambridge University Press.</p>
<p>Sadler, D. R. (2009). Indeterminacy in the use of preset criteria for
assessment and grading. <em>Assessment &amp; Evaluation in Higher Education</em>,
34(2), 159–179. <a href="https://doi.org/10.1080/02602930801956059">https://doi.org/10.1080/02602930801956059</a></p>
<p>Schön, D. A. (1983). <em>The Reflective Practitioner.</em> Basic Books.</p>
<p>Wenger, E. (1998). <em>Communities of Practice.</em> Cambridge University Press.
<a href="https://doi.org/10.1017/CBO9780511803932">https://doi.org/10.1017/CBO9780511803932</a></p>
<hr>
<h2 id="changelog">Changelog</h2>
<ul>
<li><strong>2026-01-20</strong>: Updated the Sadler (2009) reference title to &ldquo;Indeterminacy in the use of preset criteria for assessment and grading,&rdquo; matching the journal article at this DOI. Updated the Kapur (2016) reference to the full published title: &ldquo;Examining productive failure, productive success, unproductive failure, and unproductive success in learning.&rdquo;</li>
</ul>
]]></content:encoded>
    </item>
  </channel>
</rss>
