In February, the Jericho Union Free School District asked the community a single question: as we develop our strategic plan, what priorities should guide our schools to ensure student success? The answer arrived at meaningful scale. The ThoughtExchange platform reports 718 participants, 351 thoughts shared, and 12,676 ratings cast on one another's contributions, in eight languages. (How the platform defines a "participant" — whether unique humans, unique logins, or unique sessions — is not disclosed in the materials shared with the community, a point worth returning to.)
By the community's own vote, one idea rose above all the others. The single highest-rated thought, ranked first of all 351, was this: bring in the best teachers.
Three months later, the district has presented two decks shaping that engagement into a plan. The first, in February, identified seven priorities and omitted teacher excellence entirely. The second, presented on May 7, restored "Recruit the Best Teachers" as a labeled tile on one slide and named "Dedicated Staff Teachers" as the district's highest-rated Strength on another (4.5-4.6 out of 5.0) — and then dropped teachers again from every strategic goal, every objective, and every measure of success in the plan itself.
The other priorities the community ranked highly haven't fared much better. Healthier lunch is gone. Whole-student readiness is gone. College and career counseling, named repeatedly in top-rated thoughts, is gone. The Robotics Club — listed as one of seven "Identified Priorities" in February — quietly disappeared by May.
What remains is a three-goal plan dominated by a consultant's frameworks: Student Well-being, Resource Utilization, and Safe & Future-Ready. There are six objectives. No action steps have yet been shared. The most consequential synthesis steps — how 351 community thoughts became seven priorities, and then three goals — happened off-stage, with no documented criteria.
Between the survey and Thursday night's deck, the community participated a second time: at an in-person Strategic Planning Forum on March 17 at the MS Library, with both morning and evening sessions. That forum was the consultation step. The May 7 deck is the result. The community's remaining structured opportunity to push back is the Board of Education public presentation, where the plan is expected to be formally adopted. What follows is an effort to read the plan against its own community survey, in plain language — and then to ask the harder question: even if the plan reflected the survey perfectly, would it be a strategic plan?
What the community actually said
The engagement was conducted via ThoughtExchange, a platform that lets participants share ideas and rate one another's. Across 351 thoughts and 12,676 ratings, several clusters dominated:
Teachers. The single most-rated and highest-scoring thought was "Bring in the best teachers." It was reinforced by roughly half a dozen other top-50 thoughts about recruitment, retention, performance review, and the importance of teacher quality to Jericho's reputation.
Devices, screens and gaming. Multiple top-five thoughts asked the district to manage Chromebooks more tightly, block gaming sites, reduce screen time generally, and bring back hands-on, paper-based learning.
Mental health, anti-bullying, safety and belonging. A large cluster of top-rated thoughts framed student well-being not as separate from learning but as the foundation of it.
Whole-student readiness. Ideas about character, leadership, public speaking, life skills, college counseling, and preparing students for the world after Jericho.
Lunch and the school-day schedule. Specific, concrete ideas about food quality, portion size, lunch lines and time to eat — especially at the high school.
These are the priorities a fair reader would draw from the data. Two of the top-five survey responses capture the dominant tone:
"The best students need the best teachers."
"Reduce screen-based work and increase hands-on work — important for the growth of young brains."
Twenty-two percent of participation came from non-English speakers, principally Mandarin, with Korean, Spanish, Russian, Japanese and Polish also represented. That multilingual reach is one of the strongest signals of legitimacy in the entire engagement.
Top community priorities mapped to the draft strategic plan
Community priority (paraphrased from top-rated thoughts) | Star score | Where it lives in the draft plan | Status
Recruit the best teachers (recruitment, retention, performance review) | 4.5 | Not in any goal or objective. Not on the Indicators slide. | ABSENT |
Manage Chromebooks / block gaming sites | 4.4 | Goal 2 / Obj 2.2 ("reduce non-educational screen time by 30%") | PARTIAL |
Reduce screen-based work; increase hands-on learning | 4.4 | Goal 2 / Obj 2.1 ("evaluate classroom technology") | PARTIAL |
Limit Chromebook to pre-approved sites | 4.4 | Implied within Goal 2 / Obj 2.2; no commitment named | PARTIAL |
Diverse activities — sports, arts, technology, exchanges | 4.4 | Not addressed | ABSENT |
Maintain Jericho's reputation for excellence | 4.4 | Not addressed as a goal | ABSENT |
Attract and retain the best teachers | 4.4 | Not in any goal or objective | ABSENT |
Anti-bullying | 4.4 | Demoted to a sub-clause of Goal 3 / Obj 3.1 (alongside security personnel) | DEMOTED |
Healthy food / less processed; bigger portions | 4.3 | Not addressed | ABSENT |
Restrict gaming on Chromebook (esp. recess) | 4.3 | Implied within Goal 2 | PARTIAL |
A qualified, good teacher (teacher quality narrative) | 4.3 | Not addressed | ABSENT |
Block web-based gaming; secure school-issued devices | 4.3 | Implied within Goal 2 | PARTIAL |
Bring back physical books, pencil and paper | 4.3 | Implied within Goal 2; no commitment named | PARTIAL |
Dedicated guidance counselors for college process (9-12) | 4.3 | Not addressed | ABSENT |
Schools need a safe environment | 4.3 | Goal 3 / Obj 3.1 (security personnel) | PRESENT |
Comprehensive guidance — academic, SEL, career | 4.3 | Partial via Goal 1 (well-being); no career/college work | PARTIAL |
Safe, inclusive environment | 4.3 | Goal 1 / Obj 1.1 (general well-being) | PARTIAL |
Consolidate apps; reduce switching for schoolwork | 4.3 | Not addressed | ABSENT |
Positive reinforcement / safe learning environment | 4.3 | Loosely covered by Goal 1 | PARTIAL |
Prompt feedback on assessments from teachers | 4.3 | Not addressed | ABSENT |
Lunch time / lunch breaks for high schoolers | 4.3 | Not addressed | ABSENT |
Relaunch high school robotics club / strengthen STEM | 4.2 | Not addressed (named as a priority in Feb deck; dropped) | ABSENT |
Attract and retain high-quality teachers | 4.2 | Not addressed | ABSENT |
Strong, well-supported teachers using research-based instruction | 4.2 | Not addressed | ABSENT |
Build character; prepare for the real world | 4.2 | Not addressed | ABSENT |
Prioritize student mental health and well-being | 4.2 | Goal 1 / Obj 1.1 | PRESENT |
Reduce time on electronic devices | 4.2 | Goal 2 / Obj 2.2 | PRESENT |
Healthier, more diversified lunch (low sugar, low salt) | 4.2 | Not addressed | ABSENT |
Review of guidance counselor / teacher performance | 4.2 | Not addressed | ABSENT |
Communication between parents and teachers | 4.2 | Not addressed | ABSENT |
College and life readiness | 4.2 | Not addressed | ABSENT |
Maintain small class size | 4.2 | Not addressed | ABSENT |
Lunch lines / portion size at the high school | 4.2 | Not addressed | ABSENT |
Two decks, two disappearances
The first community-facing artifact was the February Strategic Planning Forum deck, prepared by TLE Consulting Group, the firm partnering with the district on the plan. It opened with the participation statistics — 718, 351, 12,676 — and then proceeded to a slide titled "Identified Priorities," which listed seven items: Artificial Intelligence, Computer Games, Healthy Meal Options, Homework, Parent Communication, Robotics Club, and Social-Emotional/Mental Health.
Teacher excellence — by the community's own ratings, the single most-supported idea — was not among the seven.
This past week's follow-up deck attempts a partial correction. A revised "Strategic Planning Priorities" slide now displays four community-derived priorities as labeled tiles, ranked by survey rating: "Recruit the Best Teachers (4.5/5.0)," "Computer & Online Management (4.4/5.0)," "Increase Hands-on Work (4.4/5.0)," and "Limit Chromebook Access (4.4/5.0)."
Then the deck moves to its actual product: three Strategic Goals, each with two Objectives. None of the three goals concerns teacher excellence. None of the six objectives. No metric on the Indicators of Strategic Success slide tracks teacher recruitment, retention, performance, or quality.
Putting the priority on a tile and not in the plan is not representation. It is the appearance of representation.
There is no Goal 4. There is no objective on competitive recruitment, retention strategy, professional development investment, or transparent performance review. The community's clearest signal is acknowledged on slide 9 and forgotten by slide 10.
The three goals, inspected
Read the goals carefully. Each one is worth holding up against what the community actually said.
Goal 1 — Student Well-being.
Defensible on its face. Mental health, belonging, and reduced anxiety are real community priorities. The objectives, however, raise questions. Objective 1.1 commits to "research and implement a school-wide mental health initiative by 2027-28 school year, aiming to reduce student anxiety as measured by wellness assessments." Which assessments? Has any baseline been measured in Jericho? If not, the metric does not yet exist. Objective 1.2 commits to training 100% of staff in mental health awareness — a process target that says nothing about whether students are healthier.
Goal 2 — Resource Utilization.
This is the goal that most directly addresses the community's screen-time concerns, and it is also the goal most thoroughly rebranded by consultant language. The community asked, in plain English, for less Chromebook gaming, fewer apps, more pencil and paper, and more hands-on work. The strategic goal calls this "Resource Utilization for Enhanced and Engaging Academic Learning Environment."
Objective 2.1 commits only to "evaluate" classroom technology — that is, to study the problem rather than fix it. Objective 2.2 promises to "reduce non-educational screen time by 30% and increase student focus and engagement by 20%." Where does 30% come from? From what baseline? What instrument will measure non-educational screen time — Chromebook telemetry, classroom observation, self-report? What instrument measures focus and engagement? Without methodology, specific numbers are not measurement; they are theatre.
Goal 3 — Safe & Future-Ready.
This is two unrelated goals stitched together. Objective 3.1 covers full-time security personnel and a comprehensive anti-bullying program — both responsive to community concern. Objective 3.2 commits to "develop and implement an AI integration plan that trains staff and students in ethical, effective AI use."
Two questions follow. First: where in the survey is the community demand for an AI integration plan? AI appears as one of nineteen keyword themes in the underlying ThoughtExchange data, but it is not among the most-rated standalone thoughts. Second: why is anti-bullying — a top-ten community concern in its own right — demoted to a sub-clause of an objective that begins with security personnel? Anti-bullying deserves its own objective, with its own metric, and arguably its own goal.
What is gone
It is worth listing what survived in the February forum deck and has now disappeared. The May deck reduces the February seven "Identified Priorities" to four "Themes": Homework & Robotics Club, Artificial Intelligence & Computer Games, Social Emotional Learning, and Parent Communications. From there, only two of the original seven (mental health and, loosely, AI) make it into the strategic goals. The priorities the consultant itself named in February and has since dropped from the goal architecture:
Healthy Meal Options (dropped entirely between February and May: gone from the themes, gone from the goals, gone from the metrics).
Homework.
Parent Communication.
Robotics Club.
Computer Games (now reframed inside Goal 2 in the most abstract possible terms).
And from the underlying community data, additional priorities that earned high ratings but never made any deck:
Whole-student readiness, including character, leadership, public speaking, and life skills.
Dedicated college and career counseling beginning in middle school.
Smaller class sizes.
Faster, more consistent feedback on assessments.
App consolidation — students juggling too many digital platforms to find and complete homework.
It is reasonable to expect a strategic plan to make trade-offs. It is not reasonable to make those trade-offs invisible. There is no document, no slide, no companion piece that explains why these priorities were considered and excluded. The community is asked to trust the result without seeing the reasoning.
The plan against its own SOAR
The May 7 deck includes a SOAR analysis — Strengths, Opportunities, Aspirations, Results — that February's deck did not. Reading the SOAR carefully reveals a second, more remarkable problem. The plan does not just diverge from the underlying community survey. It diverges from the deck's own Strengths and Aspirations slides, presented in the same room, on the same night, by the same people.
Strengths.
On the "Strengths: Staff and Community" slide, the deck names "Dedicated Staff Teachers" as a top Jericho strength, with a community rating of 4.5-4.6 out of 5.0 — the highest rating shown anywhere in the deck. Yet none of the three strategic goals concerns teachers. The district's strongest asset, by the deck's own data, has no objective for protecting, developing, or extending it. Acknowledging teachers as a strength while declining to invest in them is the cleanest example of the gap between what the deck sees and what the plan does.
Aspirations.
The Aspirations section lists six community aspirations with their ratings. Five cluster between 4.3 and 4.4: Mental Health & Well-Being (4.3), Academic Excellence & Balance (4.3), Community Engagement (4.4), Character Building (4.3), and Holistic Development. The sixth — Technology & Innovation — is rated 3.5, the lowest rating shown anywhere in the deck and well below every other listed item, community priority, theme, or strength.
The plan's three goals correspond as follows. Mental Health is reflected in Goal 1. Technology & Innovation — the 3.5-rated aspiration — is reflected in Goal 3.2, the AI integration plan. None of the four other 4.3-4.4 aspirations corresponds to any goal or objective.
The only aspiration the community rated below 4.0 is the one the plan turned into a strategic goal. The aspirations rated 4.3-4.4 are not in the plan.
This is worth pausing on. "Character Building" is listed at 4.3 with the explicit community quotation "to build student character and values, not just academics" — language that directly validates the whole-student-readiness cluster from the underlying survey. It has no goal. "Community Engagement" is the highest-rated aspiration at 4.4, with the quotation "maintain the sense of community we have always felt." It has no goal. "Academic Excellence & Balance" — "promote a healthy balance between school work and life" — sits at 4.3. It has no goal.
Opportunities.
The Opportunities slides list nine items: Student Engagement, Balancing Excellence, Technology Calibration, Stress & Anxiety, SEL Integration, Parental Pressure, Consistent Discipline, IEP & Special Education, and Anti-Bullying Programs. Of these, two are reflected in the goals (Stress & Anxiety / SEL via Goal 1; Anti-Bullying as a sub-clause of Goal 3.1). Seven are not addressed.
The plan does not reflect even the deck's own SOAR. The Strengths it claims, it does not invest in. The Aspirations it cites, it inverts. The Opportunities it surfaces, it largely leaves unaddressed.
The 718 is the wrong question
Set aside, for a moment, the gap between what the community said and what the plan commits to. Even at face value, a strategic plan for a district of roughly three thousand students cannot rest on 718 self-selected respondents — and the 718 itself deserves a closer look. The platform reports 718 "Participants," but the deduplication method is not disclosed. A single person on multiple devices may be counted as two; a household with two parents but one login is counted as one; an anonymous re-entry may be counted separately. The number is what the platform tells us; the population behind it is not transparent.
Beyond the count, ThoughtExchange's design rewards consensus around early-popular ideas — thoughts that catch on early get more visibility, which generates more ratings, which keeps the visibility going. Self-selection layers on top of that. People who are most engaged, most concerned, or most online are overrepresented; people who are working two jobs, who are skeptical that participation matters, or who don't speak English fluently are underrepresented. Multilingual reach helps but doesn't close the gap.
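The rich-get-richer dynamic described above can be sketched with a toy model. The sketch below is a Pólya-urn-style simulation and is purely an illustrative assumption — it is not ThoughtExchange's actual surfacing algorithm, which is not public:

```python
import random

def visibility_race(raters: int = 5000, seed: int = 7) -> tuple[int, int]:
    """Toy rich-get-richer model of two equally good thoughts.

    Each thought starts with one rating; every new rater is shown (and
    rates) a thought with probability proportional to its current rating
    count. Illustrative assumption only, not the platform's real logic.
    """
    rng = random.Random(seed)
    a, b = 1, 1
    for _ in range(raters):
        if rng.random() < a / (a + b):
            a += 1
        else:
            b += 1
    return a, b

# Identical quality, different outcomes depending purely on early luck:
for seed in range(3):
    a, b = visibility_race(seed=seed)
    print(f"seed {seed}: thought A got {a} ratings, thought B got {b}")
```

Run it with different seeds and two thoughts of identical quality end up with very different rating counts — which is the point: visibility, not merit alone, shapes which thoughts accumulate ratings.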
This is not a criticism of the participants. It is a criticism of treating their input as the answer rather than as one of several. A defensible strategic-planning process treats a community survey as one signal among several, and is explicit about its limitations. The current draft is built almost entirely on this single signal.
When 4.2 beats 4.1: the ranking that isn't
Both the February forum deck and this week's slides treat star ratings as a clean ordinal ranking. Recruit the Best Teachers won at 4.5. Computer & Online Management came in second at 4.4. Three more priorities tied at 4.4. The keyword themes underneath ranged from 3.8 (Summer Camps) to 4.5 (Playing Games).
On a five-point scale, with hundreds of respondents, with no published confidence intervals, and with a population of self-selected raters, differences of one or two tenths of a point fall well within statistical noise. The entire spread of nineteen survey themes is 0.7 points. Treating 4.2 as a winner over 4.1 — or 4.5 as a winner over 4.4 — is presentation, not statistics. There is no published methodology, no significance testing, no confidence interval, no acknowledgement of inter-rater variability. The numbers look precise, and they aren't.
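Back-of-the-envelope arithmetic makes the point. Per-thought rater counts and rating standard deviations are not published for this engagement, so the figures below (sd of 1.0 on a 5-point scale; 50, 200, or 800 raters per thought) are assumptions chosen only to show the scale of the noise:

```python
import math

def ci_half_width(sd: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% confidence-interval half-width for a mean rating."""
    return z * sd / math.sqrt(n)

# sd=1.0 and these rater counts are illustrative assumptions; neither
# figure is published for the Jericho ThoughtExchange engagement.
for n in (50, 200, 800):
    hw = ci_half_width(sd=1.0, n=n)
    print(f"n={n:>3} raters: a reported 4.5 is really 4.5 +/- {hw:.2f}")
```

Even at 800 raters per thought, the half-width (about 0.07) is comparable to the 0.1-point gaps the decks treat as a ranking; at 50 raters per thought, it swallows those gaps entirely.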
This matters because both decks use the rating numbers as if they conferred authority. They don't. They are a useful way to surface clusters of community concern. They are not a ranking that justifies allocating millions of dollars and three years of student experience to one priority over another. Surveys like this answer the question: what are people thinking about? They do not answer the question: what should the district do?
Where is the evidence?
Most striking, and most consequential: both decks treat the community survey as effectively the only quantitative input. There are no citations of peer-reviewed education research. There is no peer-district comparator analysis. There is no Jericho district data of the kind a district routinely tracks — enrollment trends, achievement, demographic shifts, staffing patterns, attendance, mental health prevalence, AP performance, or graduation outcomes. The May 7 deck does cite a single external rating — a Niche.com grade of A+ on the Strengths slide — but a third-party rating site is not a substitute for district-level outcome data, and a single grade is not comparator analysis. The "State of the Schools" slide in the February deck was a section divider with no data behind it; the May deck adds Strengths slides that show survey-derived ratings but still no district outcome data.
This matters most for the plan's most concrete commitment: "reduce non-educational screen time by 30% and increase student focus and engagement by 20%."
The actual research literature on screen time and learning is genuinely mixed. The American Academy of Pediatrics has published guidance on recreational screen use, but the picture for instructional technology is much more nuanced. Quality of use generally matters more than quantity. Adaptive learning platforms have a credible evidence base for some students and contexts; recreational scrolling does not. Differential effects by age, by subject, by socioeconomic status, and by what students are doing on the device are well established in the field. None of this nuance appears in the plan.
A 30% reduction without a defined baseline, a specified instrument, a literature foundation, or a pilot is not a strategy. It is an uncontrolled experiment on real students.
The community's intuition — that something is off about device use in school — is real and worth honoring. But translating that intuition into a 30% reduction target across a three-year plan, with no baseline measurement, no literature review, no comparator analysis, and no pilot, is not the same thing as solving the problem the community raised. It is moving fast in a direction that polled well. That is not strategy.
What the plan should have considered
A defensible strategic-planning process triangulates more than one input. At minimum:
Educator perspective — what teachers, who deliver the plan, know about what works in Jericho classrooms today.
Student voice — typically the missing input in these processes, and the people most directly affected by every objective.
Independent research — peer-reviewed evidence on the interventions being proposed, not the popularity of the topics.
District data — the actual baselines that any quantitative target requires in order to be measurable.
Comparator districts — what high-performing peer districts (academically, demographically) are doing differently.
Pilot results — small tests before three-year, district-wide commitments.
Most of these inputs are absent from both decks the community has seen. The plan is built on a community survey filtered through a consulting framework, with consultant-produced goal language layered on top. That isn't strategy. It is a polite synthesis of one stakeholder group's preferences, dressed up in the language of strategy.
In fairness, the May 7 deck does include a "Methods for Assessing Progress" slide that promises, going forward, regular surveys with teachers, students, and parents; involving teachers in critical decision-making and committees; and continuous data collection with tools to measure stress and anxiety. Those are the right intentions. But the current draft of the strategic plan was produced without those inputs in the room. Promising to consult educators and students after the goals are drafted is not the same as consulting them while the goals are being drafted. The plan in front of the community now reflects what was missing during the drafting, not what is promised for the implementation.
The metrics look measurable. They aren't yet.
Three of the six objectives in the new deck contain numerical targets: a 100% staff training rate, a 30% reduction in non-educational screen time, and a 20% increase in student focus and engagement. The remaining three objectives commit only to research, implementation, or development of plans — that is, to producing further plans.
A defensible target requires three things: a baseline, an instrument, and a definition. The current draft has none of those publicly documented. "Wellness assessments" are referenced without naming the tool. "Non-educational screen time" is not yet measured in Jericho, as far as the community has been told. "Student focus and engagement" has no instrument named at all.
The Indicators of Strategic Success slide, intended to define what success looks like, lists three indicators: "Visible Milestones," "Well-being Monitoring," and "Regular Updates." Not one of these is a number. Not one names an instrument. "Regular Updates" is a communications activity, not a results indicator. As written, this plan would not let a future board member, parent, or superintendent say with confidence whether it succeeded or failed.
The plan does the easy work and skips the hard
There is a final problem with this draft that becomes obvious once the goals are read against what an actual strategic plan is for. Strategic plans exist for problems that take years and capacity to solve — recruiting and retaining excellent teachers, redesigning curriculum, building mental-health infrastructure, transforming culture, aligning K-12 readiness. These are the things that require multi-year commitment, cross-functional coordination, and cultural change. They cannot be done in a sprint.
The Jericho draft inverts that logic. Several of its most concrete commitments are not strategic at all. They are operational tasks the district's IT and administrative teams could complete in days or weeks, not years.
Walk through the six objectives with that lens. Objective 2.2 — "reduce non-educational screen time by 30% and increase student focus and engagement by 20%" — is, in practical terms, a network-level filter rule, an updated Acceptable Use Policy, and an allowlist on school-issued Chromebooks. District IT with the right authority can complete it in an afternoon. The 30% becomes whatever the district wants it to be once the filter is on. Objective 2.1 — "evaluate classroom technology and use" — is a two-week audit, not a multi-year goal. Objective 1.2 — "train 100% of staff in mental health awareness" — is a single professional development day with a competent vendor. Objective 3.1 mixes a personnel hire (security) with a policy document (anti-bullying); both are standard district administration. Objective 3.2's AI integration plan is, again, a policy plus PD package.
That leaves Objective 1.1 — "research and implement a school-wide mental health initiative by 2027-28" — as arguably the only genuinely multi-year, capacity-building objective in the entire plan. And it is the one with the squishiest target ("reduce student anxiety"), no named instrument, and no published baseline.
Meanwhile, the actually strategic work — the work that takes years, leadership, and cultural change — is missing or framed in language too abstract to act on. Recruit and retain excellent teachers? Not a goal. Redesign curriculum for whole-student readiness? Not a goal. Build a real K-12 college and career counseling pipeline? Not a goal. Develop the actual clinical capacity to follow through on the mental-health intent of Goal 1? Not specified.
If the district can reduce non-educational screen time by 30% in three years, it can do it in three weeks. The 2027-28 timeline isn't an honest expression of difficulty. It's padding that makes operational items look strategic — and gives the harder work a place to hide.
A strategic plan that consists primarily of operational items the district should already be doing — packaged with multi-year timelines and percentage targets that look impressive — is not a strategic plan. It is a to-do list with consultant formatting. Worse, it crowds out the place where the actually strategic work belongs. Every objective spent dressing up an IT filter rule is an objective not spent on teacher excellence, curriculum redesign, or clinical mental-health capacity.
The process problem
Step back from the slides. The engagement-to-decision pipeline went like this: in February, the community produced 351 thoughts and 12,676 ratings. Some unseen process compressed those to seven priorities, presented in the February forum deck. The community then gathered in person at the March 17 forum to provide additional input. Another unseen process compressed everything to three goals and six objectives, presented on May 7. The plan is now headed to the Board of Education for adoption.
Two compression steps happened off-stage with no documented criteria. The most consequential decisions of the entire process — what survived and what didn't — were made without a public methodology. There is no "you said / we will do" matrix mapping community themes to plan responses. There is no published rationale for excluded priorities. The community is being asked to ratify a synthesis without seeing the math.
The flyer for the March 17 forum, visible in the May 7 deck itself, said the gathering would let attendees "Review responses from our community feedback survey, Provide additional input and ideas, Help shape district priorities." The community showed up. Seven weeks later, on May 7, the community received back a draft strategic plan that omits its top-rated priority (teachers), inverts its own SOAR's aspiration ratings (elevating the 3.5-rated Tech & Innovation while skipping the 4.3-4.4 aspirations), and drops Healthy Meal Options entirely between February and May. If the March 17 forum was a real consultation, its influence on the result is not visible in the artifact.
It is also worth noting that the strategic-planning process explicitly extends the consulting engagement past plan publication. Phase 4, "Coaching," commits to ongoing leadership development and quarterly progress reviews led by TLE Group. A plan that requires ongoing external coaching to operate has not been internalized; it has been outsourced.
What a better plan would look like
The strongest argument against the current draft is not what's wrong with it — it's that a more honest version of the same exercise would have produced something different. A plan grounded in community signal, district data, and external evidence — rather than community signal alone — would name something like five goals, each with measurable objectives:
Teacher Excellence. Recruitment, retention, performance review, professional development, and competitive compensation. The single highest-rated community priority, with explicit objectives and metrics tied to outcomes that families can see and feel.
Healthy Digital Balance. The community's top-rated screen-time and gaming concerns, framed not as a discipline issue but as a pedagogical one. Reduce non-educational device time, consolidate apps, restore physical books and pencil-and-paper work where appropriate, and define how each will be measured.
Student Well-being, Safety and Belonging. Mental health, anti-bullying, safe-and-inclusive environments, and protected time for athletics, arts, and clubs. Anti-bullying gets its own objective, not a sub-clause.
Whole-Student Readiness. Academic rigor combined with character, leadership, public speaking, life skills, and dedicated K-12 college and career counseling. The cluster the community kept describing as: give them tools, not just diplomas.
Daily Experience. Lunch quality, lunch time, schedules, smaller-class staffing where it matters, faster feedback on assessments, and stronger parent-teacher communication. The everyday-quality-of-life cluster that doesn't make headlines but defines whether families feel served.
AI deserves a single objective inside Teacher Excellence or Whole-Student Readiness — wherever the district has actual conviction about its role in teaching and learning — not its own goal stapled onto building security.
Each goal would be paired with a baseline metric, a target, a named measurement instrument, an owner, and a public review cadence. Each one would cite the evidence — district data and external research — that justifies the target chosen.
Questions worth bringing to the Board of Education

1. The single highest-rated community thought was "Bring in the best teachers." Why is teacher excellence not a strategic goal? What is the plan to recruit, retain, develop, and review teachers, and how will it be measured?
2. How were 351 community thoughts compressed to seven forum priorities and then to three strategic goals? Who decided, against what criteria, and can the community see the decision log?
3. The February deck identified Healthy Meal Options, Homework, Parent Communication, and Robotics Club as priorities. None appear in the current goals. Why were they dropped, and where in the plan will they live?
4. The platform reports 718 "Participants." How is that number defined — unique humans, unique logins, unique sessions? What deduplication method does ThoughtExchange use, and how was it verified? Of the 718, how many are estimated to be unique adults in unique households?
5. Star ratings of 4.1, 4.2, and 4.4 are being treated as a clean ranking. What is the published statistical methodology behind that ranking — confidence intervals, inter-rater variability, significance testing?
6. What independent education research informs the plan's goals and targets? What district data — Jericho's own enrollment, achievement, attendance, mental-health prevalence — supports the targets being set?
7. What is the baseline for "non-educational screen time" today? What instrument will measure it? Where do the 30% and 20% targets come from?
8. Was there a pilot of any of the proposed interventions before committing to a three-year, district-wide rollout? If not, how is the district managing the risk that the interventions don't work as expected?
9. Walk us through each of the six objectives: which are operational tasks (filter rules, policy updates, single-day training, audits, personnel hires) that district IT and administration could implement within 30-90 days? Why are operational items packaged with three-year timelines, and where in the plan is the actually strategic work — teacher recruitment and retention, curriculum redesign, clinical mental-health capacity?
10. Where in the survey is the community demand for an AI integration plan? Which thoughts and what star scores supported elevating AI above teacher excellence?
11. The May 7 SOAR rates "Dedicated Staff Teachers" as a 4.5-4.6 Strength — the highest rating in the deck — and Technology & Innovation as a 3.5 Aspiration — the lowest. Why is the 3.5-rated aspiration a strategic goal while the 4.5-4.6 strength has no goal?
12. Of the six aspirations on the deck's own Aspirations slides, only two map to goals (Mental Health → Goal 1; Tech & Innovation → Goal 3.2). Where in the plan are Character Building (4.3), Community Engagement (4.4), Academic Excellence & Balance (4.3), and Holistic Development reflected?
13. Healthy Meal Options was one of seven "Identified Priorities" in February and is gone from the May themes and goals. Why was it dropped, and where in the plan will lunch quality, time, and portions be addressed?
14. What other inputs informed the plan beyond the community survey — teacher voice, student voice, peer-district comparators? Where can we see them?
15. The March 17 forum flyer said the community would "help shape district priorities." Seven weeks later, the May 7 deck still omits teacher excellence and other top community priorities. What input was gathered on March 17, how was it documented, and how specifically did it influence the goals presented on May 7?
16. Will the community see a "you said / we will do" matrix mapping the survey themes to the plan responses? When?
17. Twenty-two percent of survey participation came in a language other than English. What is the plan for communicating the strategic plan and gathering feedback in those languages?
18. What is the cost and duration of TLE Group's continuing engagement after the plan is published, and why does implementation require ongoing external coaching?
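The point behind question 5 can be made concrete with a small, purely illustrative calculation. None of the numbers below come from district data: the rater counts (30 and 25) and the rating spread (a standard deviation of 1.0 stars) are assumptions, since ThoughtExchange's per-thought statistics have not been published. Under those assumptions, a thought rated 4.4 and a thought rated 4.1 have overlapping 95% confidence intervals, meaning the 0.3-star gap alone cannot establish a real ranking.

```python
import math

def mean_ci(mean, sd, n, z=1.96):
    """Approximate 95% confidence interval for a mean star rating,
    using the normal approximation: mean +/- z * sd / sqrt(n)."""
    half = z * sd / math.sqrt(n)
    return (mean - half, mean + half)

# Illustrative figures only -- per-thought rater counts and standard
# deviations were not published, so these are assumptions.
ci_high = mean_ci(4.4, sd=1.0, n=30)  # a "4.4" thought, 30 raters assumed
ci_low = mean_ci(4.1, sd=1.0, n=25)   # a "4.1" thought, 25 raters assumed

# If the intervals overlap, the observed gap could be sampling noise.
overlap = ci_high[0] <= ci_low[1]
print(f"4.4 thought: {ci_high[0]:.2f} to {ci_high[1]:.2f}")
print(f"4.1 thought: {ci_low[0]:.2f} to {ci_low[1]:.2f}")
print("intervals overlap:", overlap)
```

A real analysis would also have to account for the fact that raters typically see only a sample of thoughts, not all 351 — which is exactly why the question asks for the platform's published methodology rather than assuming one.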
The harder question: engagement or theatre?
In the public-engagement literature, Sherry Arnstein's 1969 "ladder of citizen participation" — still a touchstone of the field — distinguishes between true engagement and tokenism. The bottom rungs of the ladder are non-participation. The middle rungs are tokenism: informing, consulting, and placating, where the public is heard but cannot influence the outcome. The top rungs are partnership, delegated power, and citizen control, where the public has real authority over decisions.
Measure Jericho's strategic-planning process against that ladder. A community survey produced 351 thoughts. An unseen synthesis compressed those to seven priorities. The community gathered at the March 17 forum to "help shape district priorities." Another unseen synthesis compressed everything to three goals and six objectives. On May 7, the community received the result. There is no published synthesis methodology. There is no documentation of how March 17 input shaped the May 7 deck. There is no formal feedback-and-revision step between May 7 and Board adoption. There is no community veto. There is no "you said, we will do" matrix.
On Arnstein's ladder, that sits squarely on the tokenism rungs — somewhere between consulting and placation. It produces the experience of being heard without the structural mechanisms that make hearing real.
It is worth being precise about why this matters. The objection is not that parents disagreed with a particular goal. It is that the structure of the process — survey, opaque synthesis, draft slides, ratification forum, board presentation — is built to produce the feeling of being consulted without any actual influence over the outcome. Whether the people running the process intend that or not, the architecture makes that result the most likely one.
And note what is at stake in the meantime. The plan commits to specific changes in how children are taught, how technology is used in their classrooms, what mental-health interventions they receive, and how their school is secured — for three years, beginning in 2026-27 — with no published baselines, no cited research, no piloted interventions, and no comparator data. Jericho's children are not a control group. A strategic plan that proceeds without those guardrails is not strategy. It is, at best, a well-intentioned experiment.
Repairing this does not require starting over. It requires the district and TLE Group to do six things, in order, before the plan is finalized:
1. Restore teacher excellence to the goal architecture, with its own goal, objectives, and metrics.
2. Publish the synthesis criteria so the community can see how 351 thoughts became three goals — including, explicitly, why each excluded priority was excluded.
3. Add the inputs that are missing — district data, educator voice, student voice, independent research, comparator districts, and pilot evidence — and cite them in the plan.
4. Replace headline targets with baselined, instrumented metrics. If a target cannot yet be baselined, say so, and commit to producing a baseline before the target is set.
5. Treat the Board of Education public presentation as a draft to revise, not a draft to ratify. Publish the input gathered at the March 17 forum, show how it shaped (or did not shape) the May 7 deck, and commit to a further revision before adoption — with a visible record of what changed and why.
6. Commit to a multilingual communication and feedback channel for the duration of the plan.
None of these are radical asks. All of them are standard practice in well-run public-sector strategic planning. None of them are in the current draft.
The community's job, between now and the Board presentation, is to make those six things impossible to skip.
