Adams, Brian E.

Abstract

A hypothesized benefit of decentralization is that it will promote policy experimentation, yet there have been few studies examining this link. In this article, I identify three pathways through which decentralization could plausibly lead to greater experimentation and empirically assess their presence through an analysis of California’s Local Control Funding Formula (LCFF), a policy that enhanced the fiscal autonomy of local school districts. I find mixed results; even though there was some evidence to suggest greater policy activity, most new actions taken by school districts were incremental changes. Further, there was little ideological differentiation in adopted policies despite variation in districts’ partisan composition. Small districts were slightly more likely to experiment than larger ones, supporting the hypothesis that smaller jurisdictions are more nimble and flexible and thus more likely to enact policy change. I conclude by exploring how the incentives and preferences of local officials mediate the causal connection between decentralization and policy experimentation.

A long-standing argument in the literature is that decentralization promotes innovation and experimentation. In a frequently cited quote, Justice Louis Brandeis wrote in New State Ice Co. v. Liebmann, 285 U.S. 262 (1932), “It is one of the happy incidents of the federal system that a single courageous State may, if its citizens choose, serve as a laboratory; and try novel social and economic experiments without risk to the rest of the country.” Having a variety of state and local experiments will yield better results than a centralized policy because effective programs can be discovered through comparison and will diffuse across jurisdictions.
Even though there is a body of research on whether policies diffuse (e.g., Walker 1969; Berry and Berry 1990; Mintrom 1997; Shipan and Volden 2008), there is minimal theorizing and scant empirical evidence examining the notion that decentralization leads to more experimentation. The central goal of this article is to begin to fill this gap by developing and testing a theory on this relationship. My case study is the Local Control Funding Formula (LCFF), a 2013 California law that replaced over forty categorical grants with a block grant to local school districts. Even though districts are not completely free to adopt policy as they please, this law represents a significant increase in school district fiscal autonomy. The architects of the law hoped that it would encourage districts to adopt innovative programs to better serve the needs of their students, under the assumption that they can better decide how to educate children than legislators in the state capitol. I examine whether this is the case by exploring the extent and ideological range of experiments enacted in a random sample of school districts. I find that even though districts adopted some new policies, they tended to be incremental changes rather than substantial experiments. The fiscal autonomy created by the LCFF has led to differentiation across districts, with districts having various emphases and utilizing divergent approaches, yet they rarely pursued major departures from the status quo. Further, there was minimal ideological differentiation across districts. Even though conservative districts have the capacity to counter more liberal state policies, none in the sample did so. There was more experimentation in smaller districts than larger ones, suggesting that smallness creates a nimbleness and flexibility that fosters a willingness to experiment. 
The LCFF has allowed districts to differentiate themselves by choosing dissimilar emphases and subscribing to divergent programs and pedagogical techniques but has not fostered wide-ranging experimentation. In the last section of the article, I use these findings to further develop the theory of how decentralization can promote experimentation. Even though the LCFF enhanced the capacity for experimentation, the preferences and incentives facing school district leaders militated against it. The causal connection between decentralization, which is a reform that changes the locus of policymaking authority, and policy outputs runs through the preferences, incentives, and capacity of officials (Adams 2016). The inability of the LCFF to foster widespread experimentation should not be taken as an indictment of decentralization in general. Rather, it highlights that the effect of decentralization on experimentation is contingent on the resources and constraints under which policymakers operate. This article does not focus on documenting the diffusion of a specific education policy but rather assessing the tendency of school districts towards experimentation. As Boehmke and Skinner (2012) illustrate, diffusion research can either focus on one policy or examine innovativeness as a characteristic of governmental jurisdictions. In this article I do the latter, analyzing whether school districts use their newly acquired powers to pursue innovative policies. Towards that end, I define an experiment as an action that is new to the district, consistent with the use of the term in the diffusion literature (Walker 1969; Berry and Berry 1990; Mintrom 1997; Shipan and Volden 2008; Boehmke and Skinner 2012). 
This definition does not fully capture the commonly understood meaning of experimentation, which connotes that an entity is trying something novel and assessing its merits; operationalizing “novel” in a meaningful way is not feasible given that almost all education reforms have been tried somewhere else before. Thus, this article does not assess whether the LCFF has led to innovative and novel approaches to education, but rather whether it has encouraged policy change.

The Causal Logic of How the LCFF Could Lead to More Innovation

Overview of the LCFF

The LCFF was enacted in 2013 (and fully funded in the 2018–19 school year) with two primary goals. The first was to change the means through which state funding was allocated to schools. Prior to the LCFF the state distributed funds on a per-pupil basis supplemented by various pots of money from categorical grants. Thus, per-pupil state funding was fairly constant across districts, although districts that were successful at applying for categorical grants could boost their spending. The LCFF changed the formula so that districts with more students who are low-income, English learners, or foster youth receive additional funds, under the premise that these groups are more expensive to educate due to higher need and academic challenges. An “unduplicated student” count is determined for each district based on the number of students in each of the three groups, but a student in more than one group is only counted once (hence, not duplicated). Schools receive a per-pupil base grant from the state and supplemental funding based on the unduplicated student count. Schools with over 55 percent unduplicated students also receive additional funds called a concentration grant. Districts receiving supplemental funds and concentration grants are required to increase services to unduplicated students in proportion to the increase in funding they receive but have flexibility in how they do so (Vasquez Heilig et al. 2014). This funding formula is designed so that schools with more low-income students, English learners, and foster youth receive greater funding.

The second goal of the LCFF, and the focus of this article, is to devolve fiscal authority to school districts. The new law eliminated over forty categorical grants, folding them into general funding. Despite some requirements—such as increased support for unduplicated students, stakeholder engagement (Humphrey et al. 2018; Marsh and Hall 2018), and implementation of Common Core standards—the LCFF gave districts more flexibility in how to spend state funds. Even though districts still have to follow a wide range of state and federal mandates, the LCFF brought about a real devolution of fiscal authority to school districts. Moreover, most districts’ unrestricted funds have increased, giving them additional options for spending funds; simply receiving greater funding from the state will enhance fiscal autonomy. Fuller and Tobben (2014, 5) summarize the hoped-for dynamic this way: “The state’s 900-plus school districts—granted flexibility and fresh funding under the Local Control Funding Formula (LCFF)—now embark on a colorful variety of strategies to raise achievement and reduce disparities. By providing fungible allocations to local school boards and lifting regulatory controls, Sacramento invites district leaders to craft their own approaches for improving schools.” Below I explore three ways that a devolutionary policy such as the LCFF could plausibly lead to more experimentation: increasing the number of jurisdictions making policy, expanding the ideological range of experimenters, and greater flexibility created through a reduction in jurisdiction size.

Increasing the Number of Jurisdictions

The number of governments that create policy can influence aggregate levels of policy change.
In a decentralized system, the number of jurisdictions that can enact new policies increases substantially—instead of one central government, there are hundreds or thousands of local governments. A small percentage, say 1 percent, of local governments may enact new policy, but this will probably represent a significant increase in activity compared to the central government. Central governments can also enact new policy, but their capacity to do so is limited, as they can only have so many pilot programs at once and only so many efforts at policy change on the agenda at a given time. Even if the average local government is less active than its national counterpart, there may still be more activity on a local level due to their sheer numbers. In short, the larger number of policymaking bodies created by decentralization increases aggregate policy activity independent of officials’ preferences for change. Increased policy activity by local governments could have two related but analytically distinct effects: experimentation and differentiation. Local governments can use their ability to enact policies to pursue something substantially different than what they were previously doing. Alternatively, they can make incremental changes to existing policy, changing practice without shifting their approach or emphasis. Regardless of whether we use the stronger definition of experimentation (implementation and assessment of a novel idea) or the weaker one (something different than what was previously adopted), experimentation implies that the newly enacted policies are more than incremental changes to existing practice. However, even if policy change is incremental, it can have the secondary effect of differentiation across jurisdictions, as they diverge in the specifics of their approach to a policy issue.
Experimentation will create differentiation (unless all jurisdictions adopt the same ones), but it is possible that decentralization will prompt differentiation without experimentation; districts may change their activities without adopting policy that is substantively different than the status quo. There are two conditions under which decentralization would lead to neither greater experimentation nor differentiation. First, if local officials do not have the incentives or preferences to enact new policy, capacity will be underutilized, resulting in less policy activity than on higher tiers. Second, a more likely scenario is local officials enacting similar, well-established policies. If experiments enacted by local governments are similar to each other, the benefits of having many jurisdictions empowered to make decisions will be minimized. The argument that decentralization increases aggregate experimentation rests on the premise that local jurisdictions will choose different paths; having hundreds of similar experiments is not much different than having one statewide one.

I expect that the LCFF will increase both the amount of experimentation and differentiation in California’s education system overall. The state government is capable of passing dozens of education-related bills in a given year, but local school districts can introduce hundreds of reforms. Even if there are some school districts that prefer the status quo, only a portion of school districts need to experiment to create a large number of new policies. Further, school districts in California vary in their ideological makeup and local conditions. Because school districts have incentives to adopt new policies and have diverse preferences, the LCFF should increase aggregate levels of experimentation or at the very least some differentiation across districts.
The Ideological Range of Experiments

A second way that decentralization can increase innovation is by increasing the ideological range of experiments. There are political boundaries on central government experimentation. In addition to the point just mentioned regarding a limit to how many policy experiments a single government can conduct at once, there is also a limit imposed by the ideological and political preferences of government officials. A government will enact policies that are consistent with its ideology and policy platform, which significantly narrows what it is willing to try; you certainly will not see conservative governments trying liberal policy experiments and vice versa. Because local governments vary in their ideologies, they will collectively enact a wider range of policy experiments than a single central government. Scholars such as Treisman (2007) are correct that central governments can, and do, innovate, but there are political forces that limit the number and range of experiments they will conduct at any given time. As Oates (1999, 1133) suggests, “one might suspect that relatively independent efforts in a large number of states will generate a wider variety of approaches to public policy than a set of centrally designed experiments.”

The fiscal federalism literature argues that one benefit of decentralization is that it allows for a better match between policy and preferences. Diversity within a state may mean that citizens in one region desire or need different public policies than those in others, and thus a central, uniform policy may not be optimally efficient (Hooghe and Marks 2003, 236). There are three causal pathways for how decentralization can lead to policy that is more accommodating of heterogeneous preferences.
The first is that smaller size creates informational resources so that local officials have a better grasp of what citizens prefer and can therefore design policies to better meet public needs. Another possibility is that local officials are more accountable because localities are smaller in size, translating into a better match between policy and citizen preferences (Faguet 2012). Finally, local delivery of public goods will be most efficient because having more decision-making bodies allows for a better match between voter preferences and the provision of public goods and services (Oates 1972, 1999). All of these arguments rest on the assumption that local jurisdictions will adopt new policies consistent with local preferences, with the end result being an ideologically wide range of policies being enacted.

I expect that the LCFF would lead to a wider range of experiments than those enacted on the state level. Given Democratic Party dominance within the state government, education policy is left-of-center. However, some school districts within the state, especially those in rural areas, are conservative and likely to implement policies consistent with their ideology. Couple that with some very liberal urban districts, and the expectation is that there will be a wide range of experiments across the ideological spectrum.

Smallness

The third causal path through which the LCFF could promote experimentation is by shifting policy authority to smaller governmental entities. Even though there are no scholarly studies that support the notion, supporters of decentralization often contend that smaller governments have greater capacity to experiment because they are more flexible and less bureaucratic. Large organizations are difficult to change simply because they are so big; the oft-used analogy is that it is like turning a battleship. Smaller jurisdictions, on the other hand, may be more willing to try new ideas and can more easily change course.
This logic has been applied to the education policy arena through the small schools movement championed by the Gates Foundation and other reformers. In the case of the LCFF, we should see variation in experimentation across school districts, which in California range from fewer than 20 students to tens of thousands. If there is a benefit to smallness, there should be greater experimentation in smaller districts, all else being equal. Smaller jurisdictions, however, have two disadvantages that could limit their capacity to experiment. First, they have less technical expertise. Economies of scale may lead to the aggregation of technical expertise centrally that is lacking on a sub-central level. Small, rural governments may not have the technical competence to formulate and implement new policy while a central government would. However, in the case of educational reform this is unlikely to be a barrier to experimentation. Even though small school districts have less technical capacity, the problems they face are on a more manageable scale. For example, a small school district of 500 students may have an easier time experimenting with a new program (for example, an after-school program) than a larger district of 30,000. The smaller jurisdiction would likely have less technical competence on the issue, but the smaller scale makes the logistics easier and the design choices less complicated. The loss of technical capacity may be outweighed by the reduced complexity and greater tractability of the problems the jurisdiction faces. A second disadvantage to smallness is a lack of resources (Humphrey et al. 2017). Small school districts may not have the resources they need to develop and implement experiments, particularly low-income districts. The diffusion literature has hypothesized and found support for the idea that wealthier jurisdictions are more likely to innovate (Walker 1969; Berry and Berry 2014, 322).
The LCFF, however, provides additional funds to districts with large numbers of low-income students, potentially counteracting a deficit of other types of resources (such as donations to school foundations). Both small and large districts may struggle to find resources to enact new programs, and I do not expect that a lack of resources will be a larger barrier to experimentation in small districts. More generally, even though there are some downsides to smallness, the expectation is that it generally promotes experimentation.

Data and Methods

As part of the LCFF, school districts are required to prepare and submit a “Local Control and Accountability Plan” (LCAP) every year. The LCAP is a public document where districts assess past actions and describe future actions they will undertake to improve educational outcomes. The LCAPs will be the primary source of information for the analysis below, focusing on the “Goals, Actions, and Services” section of the report. Here, districts identify a series of goals (usually between three and five) and then list planned actions and services they will pursue to accomplish them. The goals must align with eight state priorities, but districts have wide discretion in the types of actions they pursue (Vasquez Heilig et al. 2014). They provide a written description of the actions, who the action is intended to benefit (all students, English learners, etc.), and a budget. This creates an opportunity to explore the extent and nature of experimentation within school districts, as the forms prompt districts to list all the actions they are undertaking to improve education. If a district is engaging in an experiment or new program it likely will appear on the LCAP; not only are districts asked to list such actions, but they have every reason to do so, as there is no benefit in keeping actions off the LCAP. On the other hand, even though experimentation is encouraged it is not required; districts do not have to identify new actions (and many do not).
The analysis focuses on 2017–2018 LCAPs from a random sample of 69 unified school districts in California (a 20 percent sample of all unified school districts). California has three types of school districts: elementary, high school, and unified. I chose to focus on unified districts because elementary districts (which are the majority of districts) tend to be small, with most large cities opting for unified districts. As a result, elementary districts do not provide the size variation needed for this analysis. Unified districts, on the other hand, vary from exceedingly small (fewer than fifty students) to over 700,000 students in Los Angeles Unified. Elementary and high school districts may have different policy dynamics than unified ones, and thus this study’s findings may not be generalizable to all districts. However, a focus on unified districts presents the best opportunity to examine how the LCFF affected policymaking on the local level, and limiting the analysis to just one type of district facilitates comparisons across districts. The LCAPs present a list of actions the school district plans to undertake in the upcoming three-year period. I coded these actions as either new or not new by using a code found on the LCAP supplemented by the presence of “change” words. Starting with the 2017–2018 LCAPs, districts were required to identify whether an action item was “new,” “modified,” or “unchanged.” However, not all of the new items merit the designation; in some cases they were new to the LCAP but not the district, identifying routine or ongoing activity. Further, of those that indicated something new, there was variation in the extent of the change. To better capture whether an action item constitutes a new initiative, I recoded this variable. All unchanged or modified items were coded as 0; because most modified items were almost identical to previous year items (often just changing funding), they were not considered new actions.
New items were coded 0, 1, or 2 based on the presence of two status quo verbs and twenty-four change verbs. The coding was done solely on the basis of the presence or absence of these verbs; there was no judgment exercised by the coder. New items that contained a status quo verb (“continue” or “maintain”) but did not contain a change verb were coded as 0. These actions may be new to the LCAP, but the presence of these status quo verbs suggests that they are not new to the district. New action items that contained neither a status quo nor a change verb, or contained both, were given a code of 1. Many of these items were either proposals to study an issue or were calls to plan for action. If an action item contained a change verb but no status quo verb, it was given a value of 2. Examples of how action items were categorized and the full list of change verbs are provided in table A2 of the Appendix. Even though it is a bit crude, this coding scheme helps to distinguish new policies from routine actions. The sum of these codes for each district, which I will refer to as an experimentation score, is a reasonably good measure of the extent of new policy activity in a district in a given year.

LCAPs from 2014–2015, the first year of the new law, were also collected to examine changes over time. Unfortunately, direct comparisons between the 2014–2015 and 2017–2018 LCAPs are not feasible because of changes in the law as well as the LCAP form itself. The older forms instructed districts to list action items in a table rather than separately, and also had separate sections for unduplicated students and the student body as a whole (prompting many districts to repeat all of their action items). Further, there was no new/modified/unchanged checkbox on the old forms.
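As a concrete illustration, the verb-based coding scheme described above can be sketched in a few lines of Python. This is only a sketch of the logic: the change-verb list below is an illustrative subset (the full list of twenty-four verbs appears in table A2 of the Appendix), and the exact string matching is a simplification of the actual coding procedure.

```python
STATUS_QUO_VERBS = {"continue", "maintain"}

# Illustrative subset only; the article's Appendix lists the full set
# of twenty-four change verbs used in coding.
CHANGE_VERBS = {"implement", "create", "develop", "launch", "establish",
                "adopt", "expand", "initiate"}

def score_action(text, flagged_new):
    """Assign the 0/1/2 code described above to one LCAP action item.

    Items not flagged "new" on the LCAP form ("modified" or "unchanged")
    score 0. A new item scores 0 if it contains a status quo verb but no
    change verb, 2 if it contains a change verb but no status quo verb,
    and 1 otherwise (neither type of verb, or both). Matching is on
    exact verb forms for simplicity.
    """
    if not flagged_new:
        return 0
    words = {w.strip(".,;:()\"'").lower() for w in text.split()}
    has_status_quo = bool(words & STATUS_QUO_VERBS)
    has_change = bool(words & CHANGE_VERBS)
    if has_status_quo and not has_change:
        return 0
    if has_change and not has_status_quo:
        return 2
    return 1

def experimentation_score(items):
    """District-level score: the sum of codes over its action items."""
    return sum(score_action(text, new) for text, new in items)
```

For example, a new item reading “Implement a new reading program” would score 2, “Continue counseling services” would score 0, and “Study options for summer school” (neither verb type) would score 1.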
Even though these differences prevent a quantitative comparison of LCAPs over time, the older documents are useful in highlighting trends and illustrating how school district activity has changed as the law was implemented. In addition to the information from the LCAPs, other district-level variables were added. Enrollment figures, the percentage of unduplicated students, and the number of low-income students (as measured by qualification for free or reduced lunches) were acquired from the California Department of Education. Party registration data was also compiled for each district. It was not available for every district; for those where it was not, the political jurisdiction whose borders most closely resemble the district’s was used. For most urban districts that was a city or multiple cities. For rural districts in unincorporated areas, the County Supervisorial district in which it is located was typically used. Descriptive statistics for these variables can be found in table A1 of the Appendix.

The analysis below will be largely qualitative (with the exception of an ordered logit regression examining the effect of size on experimentation). The central evidence for exploring the connections described above is the text of the action items on the LCAPs, which do not easily lend themselves to quantitative analysis; given their open-ended nature, classification is challenging (even though there were some opportunities for classification, such as the experimentation scores described above). Action items were coded using HyperResearch, a qualitative software tool. HyperResearch allows researchers to label passages of text, providing organization and allowing for a more systematic review of large amounts of text. For example, every time an action item mentions professional development it can be coded as such, and then the researcher can pull out all of those instances and examine them together.
This allows the researcher to qualitatively and systematically analyze what school districts are doing in relation to professional development, helping to avoid problems of cherry-picking evidence or using unrepresentative examples. HyperResearch was used in this analysis to facilitate the identification of patterns in the types of action items presented. Even though the LCAPs present a rich source of information to examine school district experimentation, there are also limitations. The LCAPs do not indicate what activities school districts engaged in prior to the LCFF, preventing a direct test of the proposition that the LCFF increased experimentation. Given this limitation, my approach is to explore what types of new actions districts engaged in and the effect they have on differentiation across districts. Even though I do not have the data to directly assess whether the LCFF increased experimentation, I can examine whether the LCFF was successful in encouraging districts to take varying approaches to educational improvement. This illustrates the extent and type of change prompted by this fiscal decentralization reform.

Analysis

The sixty-nine districts listed a total of 2,382 action items, an average of about thirty-five per district (although there was substantial variation in the number of items, ranging from a low of five to a high of ninety-two). There were some common action items that appeared in almost all of the districts. Most had a few items regarding adopting standards or instructional material consistent with Common Core as well as Career and Technical Education (CTE), actions prompted by state and federal policy. Districts also typically included action items providing targeted support for unduplicated students, demonstrating how they use supplemental funds and concentration grants.
They varied in how they chose to provide help: instructional aides, guidance counselors, online credit recovery programs, summer school, and tutoring programs were frequently listed. One particularly common action item (mentioned by thirty-three districts) was related to AVID (Advancement Via Individual Determination), a nonprofit program that trains staff and provides resources to schools. Most districts also had action items on a standard set of activities: efforts to increase attendance, improve behavior, maintain facilities, enhance technological resources, and increase parent involvement. The most common action was professional development for teachers and administrators, with 400 action items (17 percent) including some professional development aspect. Districts varied in their emphasis: some stressed professional development while others focused on interventions or behavior. In general, there were some common themes that ran through the LCAPs, although each district presented a unique set of action items.

Of the 2,382 action items, 11 percent (273) were identified as new. There were fourteen districts that did not have any new action items, and the median number of new items was only three. Just six districts accounted for 38 percent of all new items, indicating wide variation in the proclivity towards new actions. Of the 273 new action items, fourteen were coded 0 using the experimentation codes described above, eighty-one were coded 1, and the rest (178) were coded 2; this is not surprising, given that one would expect most new actions to involve some type of change. These aggregate numbers, however, do not tell us much, as an experimental district could have just one new action item that is a major initiative. Below I examine the LCAPs more closely to assess the evidence related to whether the LCFF has increased experimentation or prompted greater ideological variation, as well as whether smaller districts are more likely to adopt new policies.
The Aggregate Amount of Experimentation

The 273 new action items were generally not experiments that substantially altered a school district’s approach, but rather incremental changes to existing policy. Many new action items were minor changes to existing programs or initiatives, such as adding an extra teacher to a particular grade level or hiring more paraprofessionals, school psychologists, or nurses. For example, Calipatria Unified listed “Staffing for summer school instructional support for students struggling with mathematics grade level achievement” as one of its new action items. There were, however, action items that did lead to a new program or initiative. Some were targeted towards increasing student academic or behavioral support. For example, Temple City Unified identified this new action: “The District will provide a computer adaptive program, iReady, to support instruction and learning in Math and ELA. Further, the program will be used to monitor and support students and student groups considered to be at risk.” Other districts engaged in new professional development initiatives, such as Piedmont City Unified, which “Provide[d] K-12 professional development to support NGSS implementation and the integration of Science, Technology, Engineering, Arts, Mathematics (STEAM) using a variety of approaches: local conferences and workshops, consultant services that provide training, coaching, and lesson study.” Not all action items were directly related to classroom instruction. Lone Pine Unified, for example, stated “A student-driven behavior monitoring system will be put in place at the elementary school to encourage positive behavior on the playground.” Districts did undertake new actions: they created programs, changed curricula, and altered teaching practices. The majority of districts can point to something new that they did in an effort to improve educational outcomes. However, this activity was within narrow bounds.
Missing from the list of new action items are experiments in the classic sense, where a school district engages in a series of reforms in a coordinated effort to improve educational outcomes. Even in districts that had many new action items, the reforms were piecemeal and isolated rather than a coherent experiment. For example, Norwalk-La Mirada identified twelve new action items including professional development for administrators, improvement in facilities, more field trips, instructional support for teachers, and enhancing college readiness programs. Even though they are policy changes, these actions do not constitute a substantive or coordinated departure from existing practice. Incremental change, in the way that Lindblom (1959, 1979) used the term, is the most accurate way to characterize the new actions that districts undertook: districts made slight modifications to existing practice rather than embarking on new paths with the application of novel theory or experimental programs. Putting together the incremental nature of new action items with the fact that 90 percent of the action items were continuations of existing actions, the picture that emerges is one of modest change. Despite the incremental nature of change, the new action items districts pursued created differentiation across districts. Districts chose to emphasize different activities. Some focused on professional development, others on curriculum change, and still others on student behavior. They may be minor changes, but when districts emphasize different activities, they create distinct approaches to educational improvement. The LCFF has not led to districts being “laboratories,” since the reforms undertaken are incremental and limited. However, it has allowed districts to take divergent paths to reform. This trend holds when comparing the 2017–2018 LCAPs with those in 2014–2015.
Most districts made substantial changes to their LCAPs over this period, as the majority of action items appearing in 2017–2018 were not present three years earlier. There were a few districts where the LCAPs were mostly unchanged over time. For example, in Glendora, a suburban district outside of Los Angeles, about one-third of the action items in 2017–2018 were cut-and-pasted from one year to the next and another third were slight rewordings of older action items. The remaining third were new to the LCAP, but many were general activities that the district was probably doing prior to listing them on the LCAP. For example, one new action item was “The district will recruit and retain qualified SPED [special education] teachers and staff members, ensuring adequate staffing for at-risk student supports and UDP population.” Of the items that were new initiatives, most were incremental changes to existing policy, including items such as extending library hours, purchasing supplemental instructional material for English learners, and spending additional funds on campus security. Even though there are differences between the 2014–2015 and 2017–2018 LCAPs, the changes were largely incremental. Glendora is unusual in the repetition of action items over time, as most action items in 2017–2018 were not present in 2014–2015. However, its LCAP is typical in that the new items were modest changes to existing practice. Districts provided additional funds for professional development, purchased new technology, created new facilities maintenance plans, hired new administrators, and developed new initiatives to increase parent engagement. There were also some popular new action items—such as expanding career and technical education courses, adopting or expanding the AVID program, and implementation of restorative justice practices—that reflect recent trends in educational reform.
This comparison of LCAPs over time indicates that most districts do change their policies and implement new initiatives. Yet, there were no districts with a systematic or fundamental change in their approach to reform. We might expect the most experimentation to occur in districts that struggle academically. The lowest-performing district in the sample was Coalinga-Huron Unified, a district of 4,400 students in Fresno County. According to 2017 data published on the California Schools dashboard website, it was one of the lowest performers in math and near the bottom in English. Over 80 percent of students are unduplicated pupils, which means that since the LCFF was passed in 2013 it has received additional funding. The combination of low performance and additional funding should create conditions ripe for new initiatives. As with most districts, the majority of action items in 2017–2018 were not present on its 2014–2015 LCAP. The new action items include efforts to expand Advanced Placement courses, funding intervention aides, funding the library, hiring science and technology coaches, and an additional 1.5 hours per month of professional development. All of these items were modifications of existing programs or the creation of new initiatives that did not fundamentally alter their approach to education. Further, the reforms were piecemeal, with no overarching structure or coordination. This district, like most others in the sample, implemented modest changes to existing practice rather than substantive experimentation. An alternative approach to examine the extent to which districts adopted new policies is to focus on those with the highest experimentation scores. There were five districts with experimentation scores of over twenty: two wealthy, high-achieving suburban districts (Piedmont City and Walnut Valley), one diverse suburban district (Norwalk-La Mirada), and two rural, low-performing districts (Tulelake Basin Joint and Calipatria).
Many of the new items created by these districts were either repackaged items from the previous year or minor variations of existing activities. For example, one of the new action items on Piedmont City’s LCAP for 2017–2018 was to review and implement new content standards in the health curriculum. But the district had a similar item on the 2016–2017 LCAP that was a little more general but also addressed the issue of health content standards. Similarly, Calipatria’s 2016–2017 LCAP stated that the district will “Purchase materials and supplies to promote teacher retention and campaigns across district sites.” In 2017–2018 they listed the following action as new: “Provide support to promote teacher retention and beginning teachers.” This is technically new, as the older item just focused on purchasing supplies, but it is not a major departure from existing practice. This is true for most of the new action items in these five districts: they built on or expanded previous items. There were some genuinely new actions. For example, Tulelake Basin Joint created a summer credit recovery program and Norwalk-La Mirada proposed to implement Common Core State Standards. Yet, the overall picture that emerges is that these five districts—the ones that had the highest experimentation scores in the sample—did not embark on major reform efforts or implement substantive experiments. Comparing school district actions to state legislative activity can also provide insight into the effects of the LCFF on policy adoption. This comparison is not ideal because the LCAPs are budget documents rather than a summary of legislative activity, and state policies derived from administrative decisions will not be captured in legislative activity. Despite this, it can provide some insight into the effects of the LCFF on aggregate experimentation. State-level data were acquired from the National Conference of State Legislatures’ education database, which covers the years 2008–2017.
Only enacted legislation on a topic that could plausibly appear on an LCAP was included. Thus, bills addressing items such as teacher credentialing, on which school districts have no say, were excluded. On average, between 2008 and 2017 California enacted sixteen new laws a year that were related to LCAP topics, with a yearly low of five and high of thirty-four.4 As with school district actions, state laws were often minor changes to existing policy, such as modifying grant programs or adjusting existing regulations. In the aggregate there is more activity within local school districts: even though the average district may not have many new action items, in raw numbers there are more new action items listed in the LCAPs (273 in 2017–2018) than enacted state legislation. Individually, school districts may not be prone to experimentation, but by sheer numbers they produce more new policies than the state typically does. Another way to examine how the LCFF has allowed for differentiation across districts is to analyze the extent to which districts departed from the forty-three categorical grants that it eliminated. The elimination of these grants was a means through which the LCFF created more flexibility: rather than pushing school districts to engage in certain activities, the LCFF allowed them to decide for themselves what policies to implement. Table 1 lists the number of districts that had action items commensurate with the eliminated categorical grants. Not all of the forty-three categorical grants appear on the table because some were not specifically targeted towards school districts or would not be appropriate action items for other reasons. For example, two grants were related to the California High School Exit Exam, which is no longer administered. A few provided funds to County Offices of Education, a few others to charter schools, and some were for activities that schools are still required to do (for example, a grant providing funds for Oral Health Assessments).
Turning to the grants that are listed, for three of the grants (related to school safety and arts and music) a majority of districts had action items that were commensurate. The goals of a few other grants were endorsed by a sizable minority of districts, but for the majority of grants districts chose to prioritize other activities.5 Overall, districts varied in whether they continued to focus on the priorities enshrined in the categorical grants. Replacing the categorical grants with the LCFF has allowed districts to emphasize diverse priorities, creating differentiation. In sum, because there are hundreds of school districts but only one state, devolving power has created the conditions for a greater level of experimentation. But districts did not use their newfound power to enact experiments that departed substantially from existing policy. Districts stayed on well-worn paths and preferred incremental change. Those changes, however incremental, have led to differentiation across districts. Even though we do not see substantive departures from the status quo, the LCFF has allowed for differentiation that gives practitioners and scholars the opportunity to compare different approaches to educational reform.

Table 1. Action items consistent with categorical grants

Categorical grant | No. of districts with commensurate action items
Administrator training program | 11
Adult education | 13
Arts and music block grant | 41
Bilingual teacher training assistance | 0
Cal-SAFE (assistance to teen mothers and expectant mothers) | 7
Certified staff mentoring | 30
Civic education | 0
Class size reduction | 30
Community-based English tutoring | 5
Deferred maintenance | 9
Gifted and Talented Education (GATE) | 18
International Baccalaureate/Advanced Placement fee reimbursement | 26
Peer assistance and review | 11
Physical education teacher incentive grants | 5
Reader services for blind teachers | 0
School safety block grant/school safety consolidated competitive grant, school community violence prevention | 44

Ideological Differentiation

Given the liberal tendencies of the state government, devolution should lead to experimentation with conservative policies in districts that are strongly Republican. Even though there are limits imposed by state policy on enacting conservative policies, districts have some flexibility; there are many pro-market, choice-oriented (short of vouchers) policies they could pursue, as well as the possibility of strengthening accountability measures. Creative districts determined to promote more conservative policies could do so even within the confines of a liberal state policy framework.
I start my inquiry with a look at the five most conservative districts in the sample, as measured by percentage of Republican registered voters. These include four rural districts (Modoc Joint, Surprise Valley Joint, Tulelake Basin Joint, and Rim of the World) and one suburban district (Clovis). In all five districts, registered Republicans outnumber Democrats by at least a 3:2 margin. These districts, however, did not use their newfound powers to promote conservative policies. None of the districts emphasized—or even mentioned—school choice or explicit pro-market policies. There are no action items in their LCAPs that indicate these are conservative districts. More surprisingly, they had action items that seem to support more liberal-leaning policies. For example, Tulelake Basin Joint budgeted funds to reduce class sizes, an idea that is championed on the left but has received some criticism from conservatives (e.g., Hanushek and Lindseth 2009). Rim of the World included action items on hiring more bilingual aides, developing a green energy policy, and “Sensitivity Training…focusing on social justice, cultural proficiencies, respect and equity.” Expanding the analysis, there are twenty-five districts in the sample where Republicans outnumber Democrats, yet none of them pursued a conservative agenda. There were a few conservative action items—for example, Conejo Valley promoted a homeschooling program—but they were few and far between.6 A comparison of the five most conservative districts with the most liberal ones in the sample (Oakland, Compton, Emery, West Contra Costa, and Hayward) indicates that there are not stark differences in the types of actions proposed.
Both liberal and conservative districts implemented programs like AVID to help unduplicated pupils; they hired more guidance counselors or school psychologists; they created after-school tutoring programs, summer school, or credit recovery programs; and they altered curriculum by offering Visual/Performance Arts or CTE training. The specifics vary from district to district, but there is no significant ideological differentiation across districts. Some of the liberal districts did take actions that are opposed to a conservative viewpoint, most commonly restorative justice and dual language immersion programs, but these policies were also found in moderate and even some conservative districts. The LCFF has not created a wide ideological range of experiments; school districts tend to adopt a centrist, moderate approach. This undermines one of the central benefits of decentralization: creating variation in ideological approaches that allows for comparison and analysis. As described above, school districts did vary in their approaches, but this was a reflection of nuanced differences in preferences and style rather than ideological agendas. Districts tried different commercial programs and spent their money on diverse types of activities, but the differentiation was not along ideological lines.

Size and Experimentation

Table 2 presents the results of an ordinal logistic regression examining the relationship between district size and experimentation levels. The dependent variable is based on the experimentation scores described above and comprises three categories: the “low” category consists of districts with a score of zero (no new items), the “medium” category consists of districts with a score between one and six, and the “high” category consists of districts with a score above six.
The independent variable of interest is district enrollment, which was also recoded into a low/medium/high scale, with a third of districts in each.7 Percent registered Republican and the percentage of students receiving free or reduced-price school lunches (as a measure of district wealth) were also included and divided into three equal categories. The results indicate that small districts (those with fewer than around 4,000 students) were more likely to list new action items than larger districts. Wealthier districts were also more likely to experiment, a finding consistent with prior research on diffusion (Walker 1969; Berry and Berry 2014). These regression results suggest that there are benefits to smallness when it comes to experimentation and that a lack of capacity does not deter small districts from adopting new policies. Further, the districts with the greatest institutional capacity—large, urban ones with extensive administrative staffs as well as attention from nonprofits and researchers—tended to have low experimentation scores: sixteen of the eighteen districts in the sample with more than 20,000 students had below-average experimentation scores.

Table 2. The effects of size on experimentation

Characteristics | B (SE) | OR
Enrollment (base = high)
  Low | 1.934 (0.669)** | 6.917
  Medium | 0.062 (0.619) | 1.063
Republican (base = high)
  Low | 0.170 (0.630) | 1.185
  Medium | −0.549 (0.606) | 0.578
% Free lunch (base = high)
  Low | 1.573 (0.675)* | 4.821
  Medium | −0.702 (0.605) | 0.496
Pseudo R2 (Cox and Snell) | 0.228
N | 69

The dependent variable is a district’s experimentation score (low, medium, high). *P < 0.05; **P < 0.01; ***P < 0.001.
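For concreteness, the variable construction behind this model (the low/medium/high dependent variable and the tercile recoding of the predictors noted in footnote 7) can be sketched in a few lines. The scores and enrollments below are hypothetical, not the article's data:

```python
# Sketch of the variable recoding used for the ordered logit in Table 2.
# District scores and enrollments below are hypothetical placeholders.

def experimentation_level(score):
    """DV: low = no new items (score 0), medium = score 1-6, high = score above 6."""
    if score == 0:
        return "low"
    return "medium" if score <= 6 else "high"

def terciles(values):
    """Recode a continuous predictor into low/medium/high thirds by rank
    (footnote 7: this limits empty cells in the ordered logit)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    labels = [None] * len(values)
    third = len(values) / 3
    for rank, i in enumerate(order):
        if rank < third:
            labels[i] = "low"
        elif rank < 2 * third:
            labels[i] = "medium"
        else:
            labels[i] = "high"
    return labels

scores = [0, 3, 8, 22, 1, 0]
enrollments = [15, 690, 4400, 22734, 108, 75000]

dv = [experimentation_level(s) for s in scores]
print(dv)                     # -> ['low', 'medium', 'high', 'high', 'medium', 'low']
print(terciles(enrollments))  # -> ['low', 'medium', 'medium', 'high', 'low', 'high']
```

The recoded categories would then enter the regression as dummy variables with "high" as the base category, which is how the coefficients and odds ratios in Table 2 are reported.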
We can further explore the relationship between size and experimentation by comparing new action items in the five largest and five smallest districts. The five smallest districts (Desert Center, Death Valley, Alpine County, Owens Valley, and Surprise Valley Joint) have enrollments ranging from 15 to 108 students. Their new action items are similar to those in larger districts. Two districts (Desert Center and Surprise Valley Joint) listed adopting a new science curriculum aligned with the Next Generation Science Standards (NGSS), a common action item present in over a third of districts. There were some other new initiatives to help students academically. For example, Owens Valley created an after-school homework club and Surprise Valley enhanced their summer school program. These types of programs are also present in the largest districts (Long Beach, Capistrano, Santa Ana, Oakland, and Garden Grove). Santa Ana, for example, enhanced their dual immersion programs while Oakland experimented with a “math fellows” program.
The larger districts tended to list more action items overall (an average of almost forty compared to seventeen for the smaller districts) and had more detailed entries, probably a function of more administrative staff to fill out the forms. But they had substantially fewer new items, and there was no evidence that the depth of experimentation was any greater than in their smaller counterparts.8 Another approach to exploring how size affects experimentation is a matched-case comparison analyzing two districts that are similar in all regards except for size. For this analysis I compare Emery Unified, enrollment 690, and Hayward Unified, enrollment 22,734. Both are located in Alameda County, in the San Francisco Bay Area, and have above-average family incomes (although Emeryville,9 with a median household income of over $84,000, is wealthier than Hayward, whose median income is $75,000). They have similar percentages of students who receive free or reduced-price lunches, about 70 percent. Academically, their rankings are comparable on the California Schools dashboard, with both rated Orange (second from lowest) in mathematics and English language arts. Politically they are very liberal, with Emery having 6 percent and Hayward 10 percent registered Republicans. There is not much to separate these districts except that Hayward is much larger (and a little less wealthy). Examining the details of their LCAPs, we see that Emery is a more active experimenter, adopting a small schools approach by dividing their elementary and middle school, creating a new coding class, and implementing PlayWorks to encourage structured play and reduce bullying. They also developed “targeted strategies” to improve English language development instruction and targeted intervention for students struggling academically. Hayward, on the other hand, had no new items for 2017–2018.
Many of their forty-five action items were general statements of basic school district activity, such as “continue progress of providing our school sites with added/improved technology infrastructure and devices.” All of the specific programs were carried over from previous years. For example, both the “2nd Chance @ College” program, which allows parents to take courses at a community college, and a biomedical career pathway in their high schools were started in 2015–2016. Hayward’s focus is on creating “full service community schools,” an initiative they have been developing since 2010. Even though this is a substantive reform, one would expect that there would be commensurate new actions listed on the LCAP to implement it; the fact that there were none suggests that the initiative has stalled. That is not to argue that Hayward is doing a poor job or that it needs to make radical changes; the point is that even though it is a much larger district, Hayward is less disposed to adopt new policies than Emery. Collectively, these analyses suggest that smaller districts are more inclined to adopt new policies than their larger counterparts. LCAPs from large districts were detailed but often included few new items, focusing more on documenting routine or longstanding practices. This supports the argument that smaller jurisdictions are more nimble, less bureaucratic, and more willing to experiment despite having less capacity.

Rethinking the Decentralization–Experimentation Link

The analysis above supports the conclusion that the LCFF has allowed for some differentiation across districts but has supported only limited and circumscribed experimentation. To conclude, I examine potential reasons why it has not lived up to its promise, expanding existing theory regarding the link between decentralization and experimentation.
A central problem with previous scholarship on decentralization and experimentation is that it treats the connection between the two as direct and automatic: devolve power and the three causal mechanisms described above will activate. But this relationship is mediated by the preferences, incentives, and capacity of local officials (Adams 2016). Decentralization changes the locus of decision making under the assumption that local officials will act differently than those on higher tiers. This is a reasonable assumption, since local officials have different resources and operate under varying constraints. However, those differences do not necessarily lead local officials to engage in experimentation. The political and policy context under which local officials operate may act as a spur to experimentation, or it may limit their capacity or willingness to do so. Therefore, to understand why the LCFF has not fulfilled its promise of fostering widespread experimentation, we need to explore how the preferences, incentives, and capacity of school district leaders shape their willingness and ability to experiment. Public opinion regarding education may discourage experimentation at the local level while encouraging it statewide. A longstanding and consistent research finding is that Americans have negative perceptions of the education system in general but more positive views of their local schools. For example, an Education Next survey found that only 23 percent of respondents would give public schools as a whole an “A” or “B” grade, but 54 percent gave those grades to schools in their community (West et al. 2018). For state officials, disapproval of educational performance encourages them to enact reforms, and since they are addressing the education system in the aggregate, the fact that most citizens are happy with their local schools is not much of a barrier; reform is generally desired and seen as a positive development.
The calculus for local officials, however, is different. They are not reforming the education system in the aggregate, but specific schools that their constituents generally think are performing well. Because of that, there are limits to the extent and breadth of experiments they are willing to enact. Major changes that upend existing practices may receive a cold reception from parents and community members who are satisfied with the status quo. Because district officials are experimenting with specific schools (which people generally rate positively) and state officials are doing so with the education system as a whole (which people rate negatively), it is possible that the latter group has greater incentives to pursue experimental policies. School superintendents also have career-related incentives to limit the scope of their experimentation. In general, career advancement is enhanced by a reputation for being an innovator (Hess 1999), and superintendents have incentives to pursue experiments that can generate résumé items when they apply for a new position. However, there are also incentives to be risk-averse; failed experiments could be a career killer. These counter-pressures push superintendents towards relatively safe initiatives: a few new action items every year that can be used to pad a résumé, but nothing too sweeping. The likelihood that superintendents will enact new policies will vary depending on their tenure, ambition, and district context, but in general the incentives facing the typical superintendent will likely encourage incremental and moderate change. Beyond these incentives towards safe, incremental change, there are two additional factors that may stifle experimentation.
Some commentators have suggested that a “compliance mentality” is to blame for the modest levels of experimentation seen since the LCFF was enacted; school officials are focused on following rules created by state and federal officials rather than enacting policies that will improve educational outcomes.10 As many scholars have argued (Schneider and Ingram 1997; Soss 1999; Mettler and Soss 2004), policies send messages to target populations that affect their orientation towards politics and modify their behavior. Past education policy has sent the message to local officials that their role is to follow state and federal mandates, not to experiment, although competing messages and social discourses about education reform may lead local school officials in the other direction. It remains an open empirical question whether a compliance mentality exists and what effect it has on educational reform. School district officials may also have incentives to avoid ideologically laden reforms, explaining why conservative districts did not use their powers under the LCFF to push their districts’ policies to the right. This may seem like an odd claim given the frequent and ongoing ideological battles in large urban school districts. But dynamics may be different in mid-sized and small districts. The notion that education policy should be non-ideological still has currency among citizens, despite partisan skirmishes, and school officials may be hesitant to appear to be “politicizing” education policy. Further, even if constituents prefer an ideologically driven policy agenda, school board members may feel moderation is safer given incumbency advantage; opposition is more likely to arise from moving too far to the extreme than from being moderate. Finally, superintendents may not reflect the ideologies of the communities they serve, and even if they do, they may not want a reputation as an ideologue that could hurt their chances on the job market.
In particular, superintendents in small, rural districts may not want to enact conservative policies desired by community residents because doing so would limit their chances of moving to suburban or urban districts that are more liberal. All of the above reflections are hypotheses that await empirical examination. The main point here is that the causal connection between decentralization and experimentation is mediated by the incentives and preferences of policymakers. There is much more work to be done examining this connection, especially analyzing under what conditions local officials would be more likely to experiment than their state or federal counterparts. Examining how incentives, preferences, and capacity to experiment vary across tiers of government (as well as across policy domains) can illuminate when decentralization will lead to greater levels of experimentation. Specifically in the area of education, research needs to be conducted on the incentives and preferences of school officials regarding experimentation. Further, the finding that smaller districts are more likely to experiment calls for exploring how size influences the capacity to innovate, in particular how smaller size may create a flexibility that compensates for less technical expertise and fiscal capacity. In general, fostering greater experimentation remains a potential benefit of decentralization, but more research is needed on the conditions under which this outcome is likely.

Footnotes

1 There are various interpretations among districts as to the rules regarding how supplemental funding and concentration grants could be spent (Humphrey et al. 2017).
2 There was also a guarantee written into the law that no district would receive less funding than it did under the previous law.
3. Along those lines, the LCFF’s stakeholder engagement requirement is designed to provide district officials with a better understanding of student needs and community preferences (Marsh and Hall 2018; Humphrey et al. 2018).

4. The average amount of legislation did not decrease after the LCFF was passed.

5. Districts may have continued with the activities promoted by the grant even if they did not list any specific action items. For example, most districts have a Gifted and Talented Education (GATE) program even though they do not mention it in their LCAP. But mentioning something in the LCAP indicates that it is a priority or considered important by the district; thus, what Table 1 measures is not primarily the existence of programs but whether districts prioritize the same activities as the state.

6. This pattern also holds for the 2014–2015 LCAPs.

7. Transforming the independent variables from continuous to ordinal was necessary to limit the number of empty cells, which can bias results in ordered logit regressions (because each case has a unique set of values, two-thirds of the cells are empty when using continuous variables).

8. A similar pattern emerges if we compare the 2014–2015 LCAPs to 2017–2018: the three largest districts in the sample (Long Beach, Santa Ana, and Capistrano) saw minimal change over this period.

9. The name of the district is Emery Unified, which is located in the City of Emeryville.

10. For the perspectives of school district officials on this issue, see Koppich, Humphrey, and Marsh (2015). For a discussion of how a compliance mentality can undermine innovation, see McShane and Hess (2014, 332).

References

Adams, Brian E. 2016. Assessing the merits of decentralization: A framework for identifying the causal mechanisms influencing policy outcomes. Politics & Policy 44 (5): 820–849.

Berry, Frances Stokes, and William D. Berry. 1990. State lottery adoptions as policy innovations: An event history analysis. American Political Science Review 84 (2): 395–415.

Berry, Frances Stokes, and William D. Berry. 2014. Innovation and diffusion models in policy research. In Theories of the policy process, 3rd ed., ed. Paul A. Sabatier and Christopher Weible, 307–359. Boulder, CO: Westview Press.

Boehmke, Frederick J., and Paul Skinner. 2012. State policy innovativeness revisited. State Politics & Policy Quarterly 12 (3): 303–329.

Faguet, Jean-Paul. 2012. Decentralization and popular democracy: Governance from below in Bolivia. Ann Arbor: University of Michigan Press.

Fuller, Bruce, and Laura Tobben. 2014. Local control funding formula in California: How to monitor progress and learn from a grand experiment. The Chief Justice Earl Warren Institute on Law and Social Policy, November 2014. https://partnersforeachandeverychild.org/locations/california/

Hanushek, Eric A., and Alfred A. Lindseth. 2009. Schoolhouses, courthouses, and statehouses. Princeton, NJ: Princeton University Press.

Heilig, Julian Vasquez, Derrick R. Ward, Eric Weisman, and Heather Cole. 2014. Community-based school finance and accountability: A new era for local control in education policy? Urban Education 49 (8): 871–894.

Hess, Frederick M. 1999. Spinning wheels: The politics of urban school reform. Washington, DC: Brookings Institution Press.

Hooghe, Liesbet, and Gary Marks. 2003. Unraveling the central state, but how? Types of multi-level governance. American Political Science Review 97 (2): 233–243.

Humphrey, Daniel, Julia Koppich, Magaly Lavadenz, Julie Marsh, Jennifer O'Day, David Plank, Laura Stokes, and Michelle Hall. 2017. Paving the way to equity and coherence? The local control funding formula in year 3. The Local Control Funding Formula Research Collaborative, in association with Policy Analysis for California Education (PACE). https://edpolicyinca.org/projects/lcffrc-overview

Humphrey, Daniel. 2018. How stakeholder engagement fuels improvement efforts in three California school districts. The Local Control Funding Formula Research Collaborative, in association with Policy Analysis for California Education (PACE). https://edpolicyinca.org/projects/lcffrc-overview

Koppich, Julia E., Daniel Humphrey, and Julie A. Marsh. 2015. Two years of California’s Local Control Funding Formula: Time to reaffirm the grand vision. Policy Analysis for California Education (PACE) policy brief 15-2. https://edpolicyinca.org/projects/lcffrc-overview

Lindblom, Charles E. 1959. The science of “muddling through”. Public Administration Review 19 (2): 79–88.

Lindblom, Charles E. 1979. Still muddling, not yet through. Public Administration Review 39 (6): 517–527.

Marsh, Julie A., and Michelle Hall. 2018. Challenges and choices: A multidistrict analysis of statewide mandated democratic engagement. American Educational Research Journal 55 (2): 243–286.

McShane, Michael Q., and Frederick M. Hess. 2014. The politics of entrepreneurship and innovation. In Handbook of education politics and policy, ed. Lance D. Fusarelli, James G. Cibulka, and Bruce S. Cooper, 324–341. New York: Routledge.

Mettler, Suzanne, and Joe Soss. 2004. The consequences of public policy for democratic citizenship: Bridging policy studies and mass politics. Perspectives on Politics 2 (1): 55–73.

Mintrom, Michael. 1997. Policy entrepreneurs and the diffusion of innovation. American Journal of Political Science 41 (3): 738–770.

Oates, Wallace E. 1972. Fiscal federalism. New York: Harcourt Brace.

Oates, Wallace E. 1999. An essay on fiscal federalism. Journal of Economic Literature 37 (3): 1120–1149.

Schneider, Anne Larason, and Helen Ingram. 1997. Policy design for democracy. Lawrence: University Press of Kansas.

Shipan, Charles R., and Craig Volden. 2008. The mechanisms of policy diffusion. American Journal of Political Science 52 (4): 840–857.

Soss, Joe. 1999. Lessons of welfare: Policy design, political learning, and political action. American Political Science Review 93 (2): 363–380.

Treisman, Daniel. 2007. The architecture of government: Rethinking political decentralization. New York: Cambridge University Press.

Walker, Jack L. 1969. The diffusion of innovations among the American states. American Political Science Review 63 (3): 880–899.

West, Martin R., Michael B. Henderson, Paul E. Peterson, and Samuel Barrows. 2018. The 2017 EdNext Poll on school reform: Public thinking on school choice, common core, higher ed, and more. Education Next 18 (1): 32–52.

Appendix

Table A1. Descriptive statistics

Variable                     | Mean   | High   | Low
District enrollment          | 14,244 | 74,681 | 15
Free/reduced lunch (%)       | 57.4   | 89.8   | 1.0
Unduplicated students (%)    | 61.4   | 94.0   | 2.6
Republican (%)               | 29.2   | 51.1   | 4.4
No. of action items per LCAP | 35.5   | 92     | 5
Experiment scores            | 6.4    | 37     | 0

Table A2. Examples of action item coding

District | New action item | Code | Reason for coding (a)
Alpine Valley Unified | “For the purpose of ensuring unduplicated students receive focused instruction in their grade level curriculum: Maintain teaching staff levels to ensure straight grades for English Language Arts and Mathematics.” | 0 | Contains status quo verb “maintain” and does not have a change verb
Banning City | “There is a need for administrative and teacher support with bilingual parent outreach. The BUSD will continue to employ a Bilingual Parent Outreach consultant. (Implemented since 2014-2015)” | 0 | Contains status quo verb “continue” and does not have a change verb
Surprise Valley Joint Unified | “Continue to utilize Catapult Emergency System and maintain Radio Repeater System.” | 0 | Contains status quo verb “continue” and does not have a change verb
Davis Joint Unified | “Support the evaluation of course access in the areas of math and science, grades 8-12” | 1 | Does not contain a status quo or change verb
Silver Valley Unified | “Administer annual technology survey to all SVUSD staff” | 1 | Does not contain a status quo or change verb
Walnut Valley | Monitor student attendance, drop-out rates, and transfer rates through a review of data and documentation. | 1 | Does not contain a status quo or change verb
Owens Valley Unified | “Offer an after school Homework Club for students to get tutoring and help with homework” | 2 | Contains change verb “offer”
ABC Unified | “Purchase a new work order system and provide staff with technical support to improve workflow efficiency.” | 2 | Contains change verb “purchase”
Big Pine Unified | “Create local assessment system using NWEA MAP,AIMSweb and District Writing benchmark rubric” | 2 | Contains change verb “create”

Note: All of the action items in this table were identified as “new” on the LCAPs.
(a) There are two status quo verbs: continue and maintain. There are twenty-four change verbs: change, hire, recruit, staff, fund, purchase, implement, enhance, upgrade, strengthen, increase, invest, offer, modernize, improve, refine, strengthen, provide, add, develop, create, adopt, establish, and institute.
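The verb-based coding scheme behind Table A2 amounts to a short rule: an item is coded 2 (change) if it contains any change verb, 0 (status quo) if it contains a status quo verb but no change verb, and 1 otherwise. The sketch below is an illustrative reimplementation of that rule, not the article's actual instrument; in particular, naive word matching cannot distinguish a verb from a noun (the article codes Alpine Valley's "teaching staff levels" as 0 even though "staff" appears in the change-verb list), so part-of-speech judgment remains a manual step.

```python
import re

# Verb lists taken from the note to Table A2 (the published change-verb
# list repeats "strengthen"; a set keeps one copy).
STATUS_QUO_VERBS = {"continue", "maintain"}
CHANGE_VERBS = {
    "change", "hire", "recruit", "staff", "fund", "purchase", "implement",
    "enhance", "upgrade", "strengthen", "increase", "invest", "offer",
    "modernize", "improve", "refine", "provide", "add", "develop",
    "create", "adopt", "establish", "institute",
}

def code_action_item(text: str) -> int:
    """Code a new LCAP action item: 2 = change, 0 = status quo, 1 = neither.

    Any change verb yields a 2; otherwise a status quo verb yields a 0;
    an item containing neither kind of verb is coded 1.
    """
    words = set(re.findall(r"[a-z]+", text.lower()))
    if words & CHANGE_VERBS:
        return 2
    if words & STATUS_QUO_VERBS:
        return 0
    return 1

# Three items from Table A2 where simple matching agrees with the article
print(code_action_item("Continue to utilize Catapult Emergency System "
                       "and maintain Radio Repeater System."))         # 0
print(code_action_item("Support the evaluation of course access in the "
                       "areas of math and science, grades 8-12"))      # 1
print(code_action_item("Offer an after school Homework Club for students "
                       "to get tutoring and help with homework"))      # 2
```

A version faithful to the article's coding would need lemmatization or part-of-speech tagging so that, for example, "staff" counts only when used as a verb.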
© The Author(s) 2019. Published by Oxford University Press on behalf of CSF Associates: Publius, Inc. All rights reserved.

Adams, Brian E. 2020. Decentralization and policy experimentation in education: The consequences of enhancing local autonomy in California. Publius: The Journal of Federalism 50 (1): 30. doi:10.1093/publius/pjz006.