Define your quiet: Aligning shared expectations in public spaces through participatory research (2025)
The challenge
UTS Library was experiencing high volumes of recurring noise complaints. This reduced trust in the library as a reliable study environment and placed emotional strain on frontline staff who were repeatedly pulled away from core work to confront users.
Previous responses focused on enforcement—more signage, stricter rules and staff patrols—but complaints persisted. The problem wasn’t being resolved through authority-based interventions.
Approach
Through early reflection and stakeholder conversations, we challenged the assumption that noise was a compliance issue. Instead, we identified a systemic problem: it wasn’t that users didn’t care or were rude; it was that they didn’t share a common understanding of what ‘quiet’ actually meant.
Rather than escalating enforcement, I proposed stepping back to define the problem better through research—to understand how users interpreted ‘Low Noise’, ‘Quiet’ and ‘Silent’ zones differently in practice. The goal was to inform any future interventions with lived expectations and evidence, not policy assumptions.
Methods
I designed an in-situ, mixed-methods research approach that recognised the time-poor, cognitively loaded state of our users and prioritised low-friction engagement.
A low‑fidelity, interactive polling installation was placed at the main library entrance, allowing users to quickly and anonymously express what different noise levels meant to them. The interaction was deliberately playful and non‑authoritative, avoiding the tone of enforcement or judgement. It made clear this was their space to be heard.
This qualitative approach was supported by a quantitative synthesis of complaint data, allowing us to compare expectations against reported issues.
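One way to sketch that comparison is a simple rank check: if disagreement about a zone’s expectations tracks the complaints it generates, the two orderings should match. The figures below are hypothetical placeholders for illustration only, not the Library’s actual data:

```python
# Hypothetical per-zone tallies; real figures are not reproduced here.
complaints = {"Silent": 9, "Quiet": 54, "Low Noise": 18}          # complaints logged per month
poll_spread = {"Silent": 0.40, "Quiet": 0.98, "Low Noise": 0.81}  # 0 = agreement, 1 = split

# Rank zones by each measure, most contested / most complained-about first.
by_complaints = sorted(complaints, key=complaints.get, reverse=True)
by_spread = sorted(poll_spread, key=poll_spread.get, reverse=True)

# Matching orderings are consistent with ambiguity, not misconduct,
# driving the complaint volume.
print(by_complaints == by_spread)
```

In this illustrative case both rankings put ‘Quiet’ first, which is the pattern the findings below describe.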
Findings
In one month, the installation received 4,000 responses—a 4,000% increase compared to a previous online survey. Beyond volume, the public nature of the interaction surfaced an unexpected effect: users could see how others voted, prompting self‑reflection and informal self‑regulation without staff intervention.
Data showed strong consensus around ‘Silent’ zones and greater tolerance in ‘Low Noise’ areas, but significant misalignment in how ‘Quiet’ was understood. This highlighted that the issue was not behaviour but ambiguity: ‘Quiet’ meant different things to different people. Seeing the diversity and complexity of responses in real time helped users and staff understand why noise expectations couldn’t be resolved through simple rules or enforcement. This reduced misplaced frustration toward frontline staff and repositioned noise as a collective, systemic issue rather than individual misconduct.
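“Consensus” versus “misalignment” can be made concrete with a dispersion score over each zone’s votes, for example normalised entropy. The vote counts below are hypothetical, chosen only to illustrate the shape of the finding:

```python
from math import log2

# Hypothetical vote counts per zone on a 1-5 "acceptable noise" scale;
# illustrative only, not the installation's actual data.
votes = {
    "Silent":    [120, 15, 5, 3, 2],    # strong agreement at the low end
    "Quiet":     [40, 35, 30, 25, 20],  # votes spread across the whole scale
    "Low Noise": [5, 15, 60, 55, 15],   # broad but centred tolerance
}

def normalised_entropy(counts):
    """0 = perfect agreement, 1 = votes spread evenly across all options."""
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    h = -sum(p * log2(p) for p in probs)
    return h / log2(len(counts))

for zone, counts in votes.items():
    print(f"{zone:>9}: disagreement = {normalised_entropy(counts):.2f}")
```

A zone where everyone picks the same option scores near 0; a zone whose votes are split across the scale, as ‘Quiet’ was, scores near 1, which is what makes a single rule for it unworkable.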
This insight explained why enforcement continued to fail—and why frontline staff were unfairly positioned as policers of unclear rules.
Outcomes
Based on these findings, the Library co‑designed clearer, shared definitions for each noise zone and translated them into spatial signage, communications and policy. These definitions gave users a common reference point and equipped staff with a neutral, non‑confrontational way to resolve issues.
Live trials validated the approach, reducing ambiguity, lowering staff intervention and reframing noise as a shared responsibility rather than individual misconduct.
Impact
This project shifted noise management from policing behaviour to designing shared expectations. By sitting with uncertainty and involving users in defining the problem, the Library strengthened trust, reduced staff emotional labour and improved service clarity without additional resourcing.
The work demonstrates how participatory research and light‑touch interventions can resolve systemic tensions in shared environments—by clarifying meaning before enforcing rules.