Thursday, December 14, 2017
Ask the Administrator: Am I Ready?
A new correspondent writes:
I have been working at a medium-sized, multi-campus community college district in flyover country for a decade. After years as a full-time faculty member, attaining tenure in a social science discipline and performing years of service in our faculty senate, curriculum development, and various committees (even chairing a significant one), I developed an interest in coming up through the faculty ranks and going into academic administration. My goal was to aim for a Dean of Instruction position after two three-year terms as a department chair, and I’ve received a lot of encouragement from both faculty and administration. To make sure it was the right fit for me, I just completed my first year as a multi-discipline department chair. I’ve been handling budgets, scheduling, faculty and student complaints, hiring and evaluations, shared governance, faculty leadership, and teaching a much-reduced load, and I’ve taken advantage of professional development funds along the way. This kind of work really appeals to the way my mind works, as much as I enjoy the classroom, and it confirmed that my choice to make the jump into pseudo-administration wasn’t completely incompatible with my personality, skills, and values.
So, due to retirements and restructuring, several Dean of Instruction positions are opening up, and I’ve gotten a lot of encouragement to apply. Conventional wisdom says to wait until I’ve completed at least one three-year chair term. I feel that I need to “prove myself” more and put in the time. I’m aware that this is a very gendered opinion, and that while women are often promoted on their merits and experience, men are often promoted more on their potential.
Any advice on whether I should apply for a Dean of Instruction position when it seems to be too early in my career? Any other advice for faculty interested in going over “to the dark side” of administration?
I’ve mentioned before a key moment in grad school. I was angst-ing about whether my dissertation was really finished, or it needed still more revision. The following exchange with my roommate clarified matters:
RM: How many chapters do you have?
Me: Five.
RM: How many do you need?
Me: Well, five…
RM: Turn it in. Make them tell you what’s wrong with it.
He was right. I turned it in, and my advisor immediately set up a defense. It wasn’t perfect, by any stretch, but it had the undeniable virtue of being done.
The same is true of job applications. Postings will usually specify “minimum” and “preferred” qualifications. “Minimum” is supposed to mean that you won’t even be considered if you don’t have it; “preferred” means it might help, but it isn’t required. If there’s a job you want and you meet the “minimum” requirements, I don’t see any reason not to apply. Make them tell you what’s wrong with it.
Admittedly, this strategy involves risking rejection. But if you don’t have a reasonably thick skin, you really shouldn’t go into administration. Just treat each interview as an opportunity to learn about another college and get better at interviewing. If an offer materializes, great.
In my experience, both as a dean and as someone who supervises deans, I’ve never found age or experience beyond a certain minimum to be terribly predictive of performance. Other qualities matter much more. Contexts vary, but generally, the best deans have excellent communication skills, poise under pressure, a dedication to the mission of the college, and a strong academic sense. More time in a department chair role may help a little with developing a sense of which conflicts to escalate and which to defuse, but you may very well already have it. And fire in the belly goes a long way.
Rather than scrutinizing yourself, I’d recommend finding out what you can about the places you might apply and the realities of the jobs. Self-awareness, as opposed to self-scrutiny, can help you determine which ones are likely to be good fits. Then, with the jobs that appeal to you, take a shot. Make them tell you what’s wrong.
Wise and worldly readers, what do you think? Should she start applying, or should she wait until a second term as chair?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Wednesday, December 13, 2017
A December Evergreen
‘Tis the season for evergreens, so I thought I’d bring out this one. It’s from last year, but with a few details changed, it’s just as true today. To update it, just replace every reference to “ten percent” with this year’s figure of “thirteen percent.”
Brookdale recently ratified a contract with its faculty union, after a bit of a bumpy ride. I was on the management negotiating team, so I had a front-row seat for most of the process.
I can’t disclose anything confidential, but I don’t need to. Here’s what it boiled down to:
Union: Health insurance is eating our raises!
Mgmt: Health insurance is eating our budget!
Insurance Company (in the corner): Nom nom nom nom (burp) nom nom nom (chair collapses) nom nom nom
The bulk of the conflict was over how to divide the rapid increases in the cost of health insurance. The rest of it was relatively straightforward.
I suspect we’re not alone in this.
The catastrophic cost -- and rate of increase -- of health insurance is the 800-pound gorilla of higher ed finance. It’s the primary driver behind adjunctification. It’s increasing faster than any of our revenue sources, and it seems to be picking up steam. In negotiation sessions, it’s the sun around which every other issue orbits.
(For those keeping score at home, that makes it a nuclear-fusion-powered, 800-pound gorilla that knows how to drive a steam-powered car and anchors a series of satellites. Scary stuff.)
To make it concrete, we have three major sources of operating funds: the state, the county, and students. State and county funding have been flat for years, and enrollment is dropping. Meanwhile, the cost of health insurance goes up by at least ten percent per year. Do the math, projecting out a few years. It’s not pretty.
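Taking “do the math” literally for a moment: here’s a toy projection in Python. The roughly $78 million operating budget comes from later in this post; the $8 million starting insurance line is purely a placeholder of mine (the actual figure isn’t given), so treat this as an illustration of the trendline, not the college’s books.

```python
# Illustrative only: hypothetical figures, not the college's actual budget.
# Flat revenue vs. a health-insurance line compounding at ten percent a year.

def project(budget=78.0, insurance=8.0, growth=0.10, years=10):
    """Return (year, insurance cost, everything else) tuples, in $M."""
    rows = []
    for year in range(years + 1):
        other_spending = budget - insurance  # what's left after premiums
        rows.append((year, round(insurance, 1), round(other_spending, 1)))
        insurance *= 1 + growth  # premiums compound; revenue doesn't
    return rows

for year, ins, rest in project():
    print(f"year {year:2d}: insurance ${ins}M, everything else ${rest}M")
```

Even with made-up numbers, the shape is the point: at ten percent the insurance line roughly doubles in seven to eight years, and everything else in the budget shrinks to absorb it. Swap in this year’s thirteen percent and it doubles in under six.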
Labor negotiations are difficult because one of the parties -- the one getting the best deal -- isn’t at the table. It just jacks up prices, and the rest of us pay them. Internal disputes are over how much of each year’s cost jump is borne by whom. Nobody internal comes out ahead.
Of course, over the long term, unsustainable trends aren’t sustained. This one clearly can’t be.
Our health insurance system, if you want to call it that, was an accident of history. It emerged in its present form as an end run around wage and price controls during World War II. With pay levels frozen, companies that wanted to recruit workers had to find other enticements, so they developed packages of benefits. By the time President Truman (!) got around to proposing national health insurance, the AMA was able to argue that it was largely unnecessary. Add some red-baiting (“socialized medicine!”) and the racial politics of the New Deal coalition, and the end run became the new normal by default.
That’s why literally no other advanced country has anything like it.
Postwar prosperity made the system tenable long enough for it to start to seem natural, but it never really made sense. Now we’re seeing the flaws in the system get so large that they start to deform or consume other sectors of the economy. Prospective entrepreneurs don’t start companies because they can’t afford to pay for their own health insurance. Employers everywhere pay careful attention to maximum hours for part-time status, because the marginal cost of going over is prohibitive. If you don’t believe me, ask your HR office what the monthly premium for COBRA is.
Locally, we managed to piece together a deal that puts off the day of reckoning for a few more years. I’m glad we did -- really, really glad we did -- but the basic underlying trendlines are still there. That’s not something we can solve locally. That requires a national solution. Absent that, I foresee the rides getting bumpier and bumpier until something breaks.
Bumpier and bumpier? Check. January’s increase will cost the college a million dollars. That’s from a total college operating budget of about $78 million. And that’s just health insurance, just the increase, and in just one year. In a time of declining enrollment.
Trees, even evergreens, don’t grow to the sky. Here’s hoping this one stops before it blocks the sunlight for everything else.
Tuesday, December 12, 2017
Blind Spots and “Beer Money”
Several years ago I was part of a conversation with a professor about the college’s budget. Predictably, the budget was tighter than any of us would have liked. He put forth an impressive wish list, which he suggested funding with a double-digit tuition increase. When I balked at the size of the increase, he responded that “it’s just beer money” and the students could easily afford it.
He didn’t get the increase he wanted, but the comment stuck with me.
Based on what I know of him, I don’t think he was malicious. He actually thought he was right. And there are some students for whom he was right. But for the ones about whom he was wrong, he was terribly wrong. The blind spot could have had serious consequences.
I was reminded of that in reading the latest paper from Katharine Broton and Sara Goldrick-Rab, “Going Without.” It’s based on multiple surveys of tens of thousands of students at two-year and four-year colleges. Among its findings are that at least a third of community college students are “housing insecure,” including 14 percent who are homeless.
That’s bad enough, but it gets really shocking when Broton and Goldrick-Rab disaggregate the numbers. In Table 5, they indicate that over 57 percent of the students coded as “homeless” are working, and they’re working an average of just over 31 hours per week.
That was a jaw-dropper for me. Students who are working for pay over 30 hours a week are homeless. If you take the unemployment rate as your primary economic indicator, you’ll miss that. Anyone who’s working 30 hours per week and going to college can’t be accused of slacking. That’s not the core issue. The core is a pincer movement in which wages have stayed low while housing costs, textbook costs, and, yes, tuition, have increased.
Among the four surveys they examined, anywhere from 21 to over 36 percent answered “yes” to “Were you ever hungry but did not eat because there was not enough money for food?”
This isn’t a matter of beer money. It’s a matter of basic survival.
Yet the gaps we force students to straddle largely fall into the blind spots of policymakers. Part of that is because of the average age of policymakers, as opposed to students. Part of it is that most policymakers attended four-year colleges, often well-funded ones, and had no direct experience of serious hunger or finding a place to sleep at night. In their own experience, “it’s just beer money” may have been true. Broton and Goldrick-Rab note that hunger and homelessness are far more common at community colleges than at four-year colleges. (What that says about the appropriateness of measuring the performance of the two sectors by the same metrics, I’ll leave as an exercise for the reader.)
One quibble: Broton and Goldrick-Rab claim, in passing, that “[c]ommunity colleges rarely provide on-campus housing…” That’s less true than it used to be; according to Walter Bumphus’ talk at Middle States last week, nationally, 27% of community colleges offer on-campus housing. The number is higher than I expected, and climbing. It varies dramatically by state and region; for example, it’s commonplace in New York but nearly absent in New Jersey. In places where it exists, it may provide one option for housing-insecure students. But even allowing for a growth of on-campus housing in this sector, the numbers of students struggling for a safe place to sleep are staggering.
Sustainable solutions will probably require concerted efforts across sectors. I’ve read about some community colleges designating people on campus whose job it is to connect students in need to available social services in the area, which strikes me as a fantastic idea. Emergency funds, often from college foundations, can make the difference between a student staying housed and getting evicted, or between staying in an abusive situation for the sake of shelter and finding someplace safe. Even something as straightforward as Open Educational Resources (OER) instead of expensive textbooks can free up money for food, shelter, and transportation. That’s well within our power, as a sector. If the same Pell grant has more left over because a school has moved to OER and the student doesn’t have to buy books, those dollars become available for basic needs.
Larger changes require politics. Disinvestment in public higher education is a political choice. Upward distribution of wealth is a political choice. So are exclusionary zoning, financial aid rules based on the assumption that a student is 18 and full-time, and the minimum wage. No college or set of colleges can undo all of that alone.
But a tip of the cap to Broton and Goldrick-Rab for doing the shoe-leather epidemiology that may help us actually start to get a handle on this. No student who works 30 hours a week and takes college classes should be homeless. If we could stop thinking about aid as subsidized beer money, we might actually make some headway.
Monday, December 11, 2017
In Memory of Forgetting
Remember when it was possible to forget?
Last Friday I was able to attend a panel discussion featuring several Brookdale faculty, organized by the students of the Phi Theta Kappa chapter here. The topic was supposed to be shifting standards of beauty over time, but the discussion had a mind of its own, and it quickly turned to the stress today’s students feel in trying to live “curated” lives on Instagram and Snapchat.
I had never seen as pronounced a generation gap as I did there. The students, mostly of traditional age or close to it -- honestly, the older I get, the harder it is to tell -- told the faculty (and other older folks in the audience, such as myself) about how they live their lives around social media. One young woman on the panel mentioned as an example that before going to a concert, she picks an outfit that she thinks will look great on Instagram, and spends the first half-hour or so at the venue setting up her shot. The point of the concert is the photo. A professor on the panel responded that when she was younger, she’d choose outfits for concerts, too, but the point of the outfit was to pick up guys. Now, the record of the moment becomes more important than the moment itself.
Looking back on my teen years, it’s only by the grace of God that most of my worst cringe-inducing moments have vanished down the collective memory hole. I remember most of them, but there’s no documentary record. They never went viral, and some of them could have. Today’s young people have all of the same pressures around self-image and dating that we did, plus a new layer of public recording that we didn’t.
As useful as social media can be, there’s also something to be said for forgetting as an act of mercy. Those early years are a time of trying on different identities, ideas, styles, and ways of being in the world. Some of them work, and stick, and some of them don’t. Keeping the clunkers around forever could inhibit the trial-and-error that allows real growth.
I’m thinking here of Richard Sennett’s classic “The Fall of Public Man,” in which he argued that the gradual replacement of social “roles” with the idea of social authenticity was actually inhibiting, because in the new regime, flaws or missteps are taken to reveal something broken about the actual person. Hearing 19 year olds now worry that pictures they post of themselves being exuberant may be used against them later when they apply for jobs or run for office struck me as the reductio ad absurdum of Sennett’s argument. The curated self is very much a role, a performance, but we don’t read it as one. We take it literally, and form judgments about the person based on the persona.
As the discussion went on, I wished it had addressed the shift at the turn of the millennium from “mass” media to “social” media. I’m old enough to remember when television had four channels: NBC, CBS, ABC, and PBS. When the means of cultural production were few and tightly centralized, most of us encountered them only as consumers. Now that most young people are producing culture daily through social media, they’re living the pressures of producers, as opposed to consumers. And they’re doing it before they’ve had a chance to grow into themselves as adults.
In many ways, that’s terrific. Young people who don’t fit in to the dominant culture where they live can find kindred spirits online, drawing hope and strength from discovering that they’re not alone. They can share artistic breakthroughs in real time, rather than having to find and charm some skeevy network or music company executive.
But they miss what we had and didn’t know to appreciate: the luxury of having your awkward phases forgotten. They don’t get to rehearse before taking themselves public.
I’ve mentioned before a temperamental allergy to arguments for turning time backward, so I won’t try to argue for that. How do you keep them on the farm once they’ve seen Instagram? Instead, I suspect the way around this is through it. While many, many more of us have seized the opportunity to produce, we’re also all still consumers. If we gradually start to look at what we’re consuming through the eyes of producers, we may start to realize just how ridiculous it would be to punish some future rising star for an unguarded moment she posted at 19. We may not be able to forget anymore, but we can still choose to forgive.
Mercy can come from the fates, but we can produce it ourselves, too. As long as we’re producing everything else, a little mercy might not be a bad idea. My generation and those before it were granted the accidental mercy of a lifetime of do-overs. Today’s students deserve no less.
Sunday, December 10, 2017
Missed Opportunities: Middle States, Day 3
The last day of the Middle States conference finally got around to a discussion of issues relevant to community colleges. Walter Bumphus, the President of the AACC, gave the Friday keynote.
It was a kind of survey of issues, not going terribly deep on any of them. It was lively and entertaining, but I came away disappointed that it could have been so much more.
He opened with a reference to Elizabeth Warren’s description of regional accreditors, in the wake of the meltdown of the for-profits, as “the watchdog that didn’t bite.” I was hoping he’d go somewhere with that, but he just cracked that “the last thing we need is the Feds in accreditation” before moving on.
That was a missed opportunity. The day before, Peter McPherson mentioned (correctly) that part of the problem with accreditation as a tool for quality control is that it’s binary: either you’re accredited or you aren’t. That may be okay for a startup, but for a mature institution, it tends to lead to a certain skepticism. The City College of San Francisco wasn’t allowed to fail, because it’s too large and important to the city. Loss of accreditation is a sort of nuclear option, but every existing option short of that relies on the believability of the nuclear option. If nobody seriously believes that an accreditor will go nuclear, the intervening threats lack a certain bite.
McPherson proposed instead that sanctions could come in parts or stages. When I asked for an example of what that might look like, he came up with caps on financial aid availability. For instance, a college might become eligible for only 90 percent of the previous year’s Title IV allocation. I don’t think that particular method makes sense -- student loan eligibility follows the student, and some schools have dropped out of the program altogether -- but the concept makes sense. If there’s something between “double secret probation” and complete shutdown, the accreditors might become more willing to act, and colleges would be forced to take the threat of oversight more seriously. A less gun-shy accreditor might be able to head off disaster earlier.
Bumphus went on to give an overview of the contours of the debate around the reauthorization of the Higher Education Act, as well as the politics of higher ed in the Trump years. He fired off some good lines -- “If you aren’t at the table, you’re on the menu” -- but otherwise covered well-worn territory. I was heartened to hear a distinction between “registered” apprenticeships and “recognized” apprenticeships, but otherwise it was largely about enrollment and funding. (To be fair, he did spend some time on DACA, which is a very real issue in this sector.) He took the existing business model for granted.
That’s understandable, but again, a missed opportunity. Part of the widespread presidential turnover throughout the sector that he mentioned -- he cited 250 presidencies turning over per year, out of just over 1100 community colleges nationally -- comes from a failure to come to grips with the need to change the underlying business model. Growth forgives many sins, so the flaws in the model could be tolerated as long as there was a demographic and/or political tailwind. But with demographics in much of the country working against community colleges, and with increasing hostility from various parts of government, the gaps in the model are becoming apparent. That puts presidents in a tough spot, since the short-term political cost of structural change is often higher than simply coasting downwards. And many boards simply don’t understand the issues at hand well enough to distinguish necessary conflict from unnecessary, or what’s under a president’s control from what isn’t. They wind up blaming presidents for demographic shifts, or expecting miraculous change without anyone getting upset. Nobody can live up to that.
Bumphus mentioned a new “onboarding” program the AACC is offering new presidents, and that’s fine. But if you don’t address boards and business models, you’re setting the newbies up to fail. Yes, we’re due for a recession, and that may provide a short-term enrollment boost. But unless we address the longer-term issues underlying the sector’s struggles, we’re asking presidents to be superhuman. Nobody is superhuman.
Still, I was gratified to see community colleges take center stage. Now we need to stop pretending that the challenges are entirely short-term, and start digging deeper.
Thursday, December 07, 2017
What Gets Said, and What Goes Unsaid
I’m in Philadelphia at the annual conference of the Middle States Commission on Higher Education, which is the regional accreditor for the mid-Atlantic states and Puerto Rico.
It’s different from the League or the AACC conference in that it doesn’t focus particularly on community colleges. It covers everything from community colleges to research universities, which means I sometimes get to hear presenters from other sectors of higher education. And I can’t help but notice certain patterns.
The Thursday morning keynote was by Peter McPherson, the president of the Association of Public and Land-Grant Universities. He addressed accountability in higher ed from the perspective of the group he represents. Which is to say, it was odd.
He rightly called attention to the relative paucity of low-income students at some of the more prestigious universities, and also noted correctly that much of the issue around student loan defaults is really about dropouts. By his statistics -- and I didn’t catch the source or the frame of reference -- the default rate for student loans among college dropouts is 24%, as opposed to 9% for graduates. So far, so good.
But it was largely downhill from there.
His solution to the problem of elite campuses getting ever more elite was...drum roll...a Gates-funded program to increase the number of low-income students there.
Nothing inherently wrong with that -- it’s a good thing, as far as it goes -- but it’s a boutique solution. Those of us in the trenches on student success issues know that “boutique” is a dirty word. If you want to make a significant and lasting change, you have to get at structure. In this case, that would mean making it easier for community college students -- a much more demographically diverse group -- to carry their credits with them when they transfer.
That would make a sustainable difference over the long term. It would ratify community colleges as on-ramps to the higher echelons of higher education, as they were intended to be. It wouldn’t even require grant money to keep going. But it would involve political battles, both among institutions and within them.
If he mentioned community colleges at all, I missed it.
To add insult to injury, he also endorsed “risk-adjusted assessment,” which would let elite institutions off the hook for much of what we have to do. It’s annoying enough at that level -- Harvard gets a free ride while we have unfunded mandates -- but it would also enable the unfunded mandates to increase unchecked over time, since the elites who largely set the expectations wouldn’t have to meet them themselves. It would be akin to Congress exempting itself from civil rights and sexual harassment rules. How did that work out?
McPherson was speaking as a representative of research universities; I get that. His suggestions made sense from that perspective. But higher education isn’t just research universities. Any serious discussion of class polarization in higher ed has to include community colleges. Any serious discussion of accreditation or assessment has to recognize that putting the most expensive protocols on the least-well-funded institutions isn’t likely to lead anywhere good.
The whole point of getting the sectors together, to my mind, is to enable a broader look at the entire higher ed ecosystem. That involves acknowledging each component of the ecosystem. Should the Ivies and state flagships have more students from the bottom half of the income distribution? Absolutely! Wherever might they find talented students from the bottom half who have shown the capacity to excel at college-level work?
Tomorrow Walter Bumphus, from the AACC, will keynote. Here’s hoping for a needed corrective...
Wednesday, December 06, 2017
DeVry is being handed over, for no money, to a for-profit college that has 600 students. Its current parent company isn’t even getting a player to be named later.
I’m trying to figure this one out.
Several recent moves in the for-profit industry have left me agape, but this one is striking for the sheer incongruity of size. An international chain being taken over, for no money, by a college the size of my high school graduating class?
As longtime readers know, I used to work at a DeVry campus. It was my first “real” job out of graduate school. DeVry was in rapid-expansion mode at the time -- it was the late 90’s tech boom -- and the nonprofits had already made the move to adjuncts. I could adjunct at Rutgers or be full-time at DeVry; I chose the latter, because rent doesn’t pay itself.
This may sound like rose-colored glasses now, but I swear it’s true: for a while there, DeVry was actually trying to gain a sort of academic legitimacy. I caught the tail end of that and the beginning of the decline. When the decline started to accelerate and the “for-profit” part started to drown out the “education” part, I bolted for the community college world. That was over fourteen years ago.
Still, it’s hard to imagine the place being swallowed by a college of 600 students. When I worked at the North Brunswick campus, it had 4,000 students by itself. And it was one of two dozen campuses around the country.
DeVry’s decline was largely self-inflicted, mostly because when it faced a choice between improving quality and improving quantity, it chose the latter. You can only hollow out quality for so long before something terrible happens.
I’ve been trying to suss out the motivations on both sides for the “sale.” (“Handoff” comes closer to the truth.) As near as I can figure, the “buyer” stands to gain scale quickly without paying for it. The last decade has shown pretty clearly that tuition-driven small colleges will struggle. It’s easier to change the “small” part than the “tuition-driven” part. And the “seller” stands to offload a whole bunch of potential legal judgments against it. The “buyer” may be gambling that its very lack of assets will deter lawsuits, just because there’s nothing to win; the “seller” has probably determined that it’s better off just washing its hands of DeVry and walking away.
I can’t really see the “buyer” doing well long-term, unless it knows something substantial that I don’t. The organization had been so cost-conscious for so long that I don’t see a lot of efficiency gains to be had. And its course offerings aren’t particularly unique. You can get a business or CIS degree in a lot of places, and often less expensively. Some of what was once innovative about it has long since become common practice elsewhere. In a competitive industry, I’d be hard-pressed to name what would make it special.
Of course, as I write this, Congress is looking at changing the Higher Education Act in ways that will make for-profit colleges’ lives easier and public colleges’ lives harder, so the “buyer” may be taking a calculated risk that it can wait until the good times for for-profits come back. If the buyer is struggling to survive anyway, I could see the logic behind a “what the hell” move.
Wise and worldly readers, is there a logic to this move that I’m missing? At best, this looks like a “cut your losses” move by the “seller,” and a Hail Mary pass by the “buyer.” Is there a better reading?
Tuesday, December 05, 2017
Green Eggs and Ham, but for Grownups
We used to feed our dog a particular brand of food. She ate it for years without complaint, though, to be fair, I’m not sure what a complaint would have looked like. At one point, though, her brand was recalled for some horrible mishap at the plant, so we had to get her a different brand. She absolutely wolfed down the new stuff. When I mentioned it to someone who cares for dogs, she said that every time there’s a recall, owners report that the dogs like the new food better. It may or may not be that they actually like it better, she said, but they like the break in the monotony. Owners just don’t think to deviate from a good enough solution until they’re forced to, at which point it seems retrospectively obvious.
The Times had a good piece a few days ago about something similar on the London Underground. When some routes were blocked off for construction and riders had to find other ways to get to and from work, a non-trivial number of them found ways that were faster than their habitual ones. They stuck with the new routes even after the old routes reopened. They had stuck with relatively inefficient routes out of habit, until the habit was forcibly broken.
Anyone familiar with Dr. Seuss’ oeuvre will recognize these as variations on the story of green eggs and ham. The unnamed hero wouldn’t eat green eggs and ham until Sam I Am wore him down, at which point he discovered that he liked them. The earlier refusal was revealed as little more than prejudice, probably based on habit.
The unnamed hero, the London commuters, and so many dog owners fall into the same trap. They (we) are a little too quick to discount or dismiss alternative possibilities. A good enough solution seems good enough to not bother looking for an even better one.
And there are times when that makes perfect sense. We’ve all had the friend who could never make even a simple decision. In deciding where to get dinner, sometimes almost any decision is better than no decision. Incessantly trying to maximize everything in life would be exhausting and self-defeating. Sanity requires picking battles. Perfectionism can prevent actually getting anything done.
But in academia, in particular, I’ve noticed that we’re sometimes a little too quick to dismiss possibilities. We’re a little too close to Sam I Am, leaving perfectly good green eggs and ham uneaten because they don’t match our inherited notions of what good food looks like.
(For that matter, I’ve long wondered why there’s no blue food.* But that’s another post.)
On my campus, we’ve done Open Houses in the fall and spring for years. They had always been in a “trade show” format in the arena, with various programs in booths next to each other. Last fall, for unrelated reasons, that became impossible. So we switched to using the entire campus. This year, given the option, we stuck with the entire campus. It’s a pretty campus with some terrific facilities, so why not show it off? But it took the force of circumstance to compel the initial change. If not for that, we’d probably still be in the arena.
Sam I Am enabled a breakthrough because he didn’t stop. There’s a lesson in there...
*Blueberries are more purple than blue. “Blue” cheese is mostly white. The mystery remains.
Sunday, December 03, 2017
The View from Here
I don’t usually do overtly political posts. Part of that is because I’m not convinced that they would help, and part of it is that I need to be able to make common cause with people whose politics are different from my own in order to help my college, and other community colleges, thrive.
That said, the combined impact of many of the provisions of the two tax bills floating around Congress now could be backbreaking for the entire sector. It’s as if they were written out of spite. I feel ethically obligated to offer the view from here.
A few specifics:
Disallowing the deduction for state and local taxes would drive a stake through the heart of our appropriations. State and local government budgets don’t have a lot of discretionary spending in them, once you take care of law enforcement, K-12, and Medicaid. Taking away the federal tax deduction for money that people have already paid in taxes to states, counties, or towns would amount to a drastic tax increase on them. (One version of the bill even disallows the deduction for property taxes. New Jersey has the highest property taxes in the country. It would be devastating here.) We’ve been dealing with austerity for years, but this would take cutting to a new level, and abruptly.
Reducing the impact of deductions for charitable contributions would make it much harder to maintain, let alone increase, our philanthropic fundraising. Coming at the same time as a direct attack on our appropriations, this is doubly disturbing. Community colleges have been late to the game in terms of fundraising, but in the aftermath of the recession, they’ve been getting somewhat better. This is a kick in the teeth.
Repealing the individual mandate for health insurance will inevitably drive up its cost, which is already climbing at an unsustainable rate. At my own college, for instance, “family” coverage for an employee costs about $30,000 per year (split between the employee and the college), and it’s set to increase 13 percent next year. It amounts to a viciously regressive tax on employment, with predictable effects on full-time hiring. The only thing keeping it from going up even faster is the presence of relatively low-cost young people in the insurance pool. They contribute more than they use. Allow them to opt out, and the cost for those remaining will increase even faster. I’ve seen that called “adverse selection” or a “death spiral.” Whatever you call it, it’s unwelcome.
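The arithmetic behind a “death spiral” is easy to sketch. Here’s a purely illustrative calculation; every figure in it is hypothetical, not drawn from my college’s actual plan:

```python
# Illustrative sketch of adverse selection (all numbers are made up):
# the average cost per member rises once low-cost members leave the pool.

def average_cost(groups):
    """groups: list of (member_count, annual_claims_per_member) tuples."""
    total_members = sum(n for n, _ in groups)
    total_claims = sum(n * cost for n, cost in groups)
    return total_claims / total_members

# Hypothetical mixed pool: young members claim little, older members claim more.
before = [(600, 2_000), (400, 12_000)]   # full pool
after = [(200, 2_000), (400, 12_000)]    # two-thirds of the young opt out

print(round(average_cost(before)))  # per-member cost with the full pool
print(round(average_cost(after)))   # higher per-member cost after opt-outs
```

The point of the toy numbers is just the direction of the effect: remove people who pay in more than they take out, and everyone who remains pays more, which pushes still more people to leave.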
One of the few perks we’ve been able to afford to offer employees is tuition waivers for themselves and their dependent children. One proposal calls for taxing those as income. I don’t see proposals for taxing “employee discounts” in other industries, though. Just education. Given that a dollar is a dollar, one has to wonder at the real motivation. Granted, the impact would be greatest on graduate students; in many cases, their tuition waivers are nominally larger than their stipends, so their taxes could increase severalfold. (Eventually, that could reduce the pool of adjunct faculty, driving up costs even more.) It could also do a number on our non-credit corporate training side, where employers often pay for employee training. Given that we keep hearing about the need for colleges to be responsive to employers, that’s counterintuitive at best.
All of this in the context of bills that allow a tax deduction for private plane maintenance. Honestly, I couldn’t make this up.
Over the past few years, I’ve mentioned repeatedly that one reason for community colleges’ struggles is that they’re built to create a middle class for a country that no longer wants one, or that no longer understands where middle classes come from. I’ve been accused of hyperbole for that, but based on the bills currently under consideration, I’m actually guilty of understatement. These provisions would be devastating for public higher education and for the people who need it. And they’re specific enough that it’s hard to ignore motive. They’re direct, precise attacks on people who devote their careers to creating a middle class.
At some level, we either believe in a middle class society or we do not. I do. I hope enough others do, too.
That’s how it looks from here.
Thursday, November 30, 2017
This one is a bit of an evergreen, but ‘tis the season for chopping down evergreens. I’d like to chop this one down once and for all.
We have online classes for which all of the graded work is done online. But we also have online classes in which students are required to be physically present, either on campus or at a designated (and sometimes expensive) testing center, to take tests.
Every single semester, we wind up with disgruntled students arguing for refunds for online classes when they discover, upon getting the syllabus, that they have to be physically present at a given place and time to take a test. As the students explain repeatedly, part of the reason they chose an online class was precisely so they wouldn’t have to show up. Sometimes it’s a medical or physical issue and sometimes it’s a transportation issue, but either way, the “online class” label feels like false advertising.
They have a point.
When I ask the faculty whose courses require onsite testing why they require it, the answer is nearly always a concern about cheating. In a classroom or a proctored testing center, they argue, most cheating can be either deterred or caught; online, though, students can get away with a lot. Academic integrity matters, so they just can’t bring themselves to go fully online.
They also have a point.
Which puts me in a tough position. Academic integrity absolutely matters, and I have no illusions that all students are as pure as the driven snow. But I also have to agree that requiring students to come in for a class advertised as online feels deceptive.
We’ve adopted a “lockdown browser” that prevents a student on a given device from looking at anything else on that device except the exam. We even have a system that uses the student’s webcam to take still photos at unannounced intervals during the exam, making it very unlikely that a student could consult a second device undetected. But even with our own mini panopticon, some faculty remain unconvinced.
Philosophically, I’m uncomfortable with the idea of just issuing some sort of diktat about what they can and can’t grade. That gets to a level of interference in a class that I would resent deeply if it were imposed on me. But the issue of “false advertising” is real.
So I’m looking at two ideas, and hoping my wise and worldly readers have better ones.
The first is to list any section with required onsite testing as “hybrid,” rather than “online.” It comes closer to the actual truth of the matter. Save the “online” label for sections that are purely online. It strikes me as a way to preserve academic freedom while finally putting to rest any claims of misleading advertising. If you want to require onsite exams in your online class, that’s fine, but you have to label it a hybrid. Fair is fair.
The second is to help skittish online faculty come up with better ways to assess student work, so the idea of proctoring becomes irrelevant. Many classes have long involved papers that are written outside the view of a proctor, so it can be done.
So I’ll throw it open to my wise and worldly readers. Have you found, or seen, innovative ways to assess student learning online to get around the dilemma of the proctored test?
Wednesday, November 29, 2017
The Girl is Published!
I’ll get back to my regularly scheduled meditations on higher education, but today calls for some unfiltered parental cheerleading.
The Girl is 13, and in the 8th grade. At the behest of the advisor to the Publishing Club (!) at her school, she wrote an entry for the New York Times’ “Best Books of 2017” feature. Her post, as it appears there:
I’ve read a variety of books throughout 2017, but “Carry On” by Rainbow Rowell stands out not only as my favorite from this year, but of all time. The book is everything I want to be as a person; clever, funny, sweet, and interesting. The characters worm their way into your heart and soul, and yet manage to be believable. Each one flips a character stereotype on its head in a beautiful way. The “chosen one” of the story is horrible at magic, the “evil British vampire guy” is sweet and in love, the “smart female best friend” actually has flaws (wow!) and the “beautiful girlfriend of the main character” is, in my opinion, vile. Despite my opinion, she is extremely realistic.
Reading it in one seven and a half hour sitting during the summer was not my best decision. If I had known that it was a work of pure genius, I would have savored the book, tasted it like a five course meal, and finished it with a content sigh. Instead, I stayed up until 1:26 am and wasn’t able to think about the book without internally shrieking for another week, but there are worse things, I suppose. There are worse things than falling in love with a book.
Looking into 2018, I’m looking forward to reading any new book by Rainbow Rowell, having read all of her past ones and loving each one of them. And while a part of my heart will forever remain tucked into the off-white pages of “Carry On,” I will keep my soul open, ready and waiting for a different book to sweep me off into a newer, brighter world.
I’ll admit parental bias, but I’m insanely proud of her. She’s a voracious reader -- I’ve written before of our trip to Comic Con to see Rainbow Rowell speak -- and even at thirteen, you can see a writerly voice starting to emerge. It sounds like her. “Clever, funny, sweet, and interesting.” Yup.
“There are worse things than falling in love with a book.” Yes, there are. Yay, TG!!!!!
Tuesday, November 28, 2017
Letters of Recommendation? Still?
The U of Venus bloggers did a good exchange at IHE on the value of letters of recommendation in academia. I’ll throw in a perspective from my corner of the world.
Since we’re an open-admissions college, students don’t need letters to get in, and we don’t ask for them. For faculty, staff, and administrative hires we’ll ask for names and contact information for references, but we contact them only for candidates on the very short list. Nobody wants to wade through 50 to 100 sets of letters for a single position, especially given how unrevealing they tend to be. For the references we actually check, HR reaches out by phone.
In the past, I’ve been the one to reach out by phone. I’m not asking for more work, but you can learn things in a live exchange that you might not pick up from a carefully sanitized letter.
I had one candidate for a teaching position who supplied three names to call. When I called one of them and explained that I was calling to follow up on a reference for the candidate, he asked “who?” In another case, when I asked about any reservations the person might have about recommending the candidate -- typically, a gimme -- I got a long pause followed by a tremulous “I’m not comfortable answering that.” Coming from someone the candidate himself chose, that was striking.
The silences were often louder than the words.
The reason this version of reference checking works, I think, is that we’re asking it to fulfill a different function than many places do with faculty searches. We don’t use references to winnow down the applicant pile. We winnow down the pile based on our own criteria, followed by performance at the first round interview (which, for faculty, includes a teaching demonstration) and the second round interview. Reference checking in this system isn’t about seeing who had the biggest name advisor; it’s about making sure that the person who wowed us at two rounds of interviews doesn’t have some Terrible Secret we should know.
In other words, good references wouldn’t get you a job, but bad ones could lose you a job. They’re about verification, rather than distinction.
Good reference calls are really quick. Less-good ones usually take longer, as they should. I’ve seen that when I was the one giving a reference, too. I’ve had the good luck to have worked with some terrific people over the years. Every so often, one of the real stars applies for something and asks me to be a reference. Last year I got a call for a former colleague whom I consider a rock star; I don’t think the conversation hit the two-minute mark. “I’m jealous that you get to hire her and I don’t” doesn’t take long to say.
Oddly enough, the one place I’ve been where we used letters in the first round was DeVry. I remember not knowing how much weight to put on most of them. Does a relatively brief letter indicate a lukewarm endorsement, a pithy writer, or a different culture? Later I saw reports of studies suggesting that gender and racial bias creep into letters, which wasn’t really surprising. About ten years ago a favorite colleague -- a high-energy woman -- asked me to write a letter for her application to a doctoral program. It took me a few drafts to find language that conveyed “high energy” in a positive way that didn’t set off stupid gendered trip wires. It worked -- she got in -- but the fact that it took conscious effort to avoid those trip wires was revealing in itself.
I know academia isn’t quick to change, but I wouldn’t mind at all seeing the old tradition of letters for everybody go the way of the typewriter. It’s a vestige of an earlier time, rife with bias and light on useful information. A few live conversations do much more good. Let the candidates shine, or not, on their own. Just be sure to listen carefully for the silences when you call to verify.