Hi Chris, thanks for the topic.
In the "good old days" it was called "billable time." It was usually assigned to individuals within technical functions, e.g., Civil, ME, EE, Arch., etc.
The background for this was usually a project within which one played a supporting or leading role.
Time sheets then recorded the time billed to a project, and that was compared to the project's planned budget (see "Earned Value Management"). The comparison showed whether the work planned for the project was ahead of schedule, on schedule, or behind schedule, so that management could make suitable adjustments to get the work back on plan.
It gave everyone guidance on what the organization expected in order to remain profitable and stay in business, which in turn supported many benefits, e.g., college tuition.
In the past, this ratio generally ranged from 70% to 85%, depending on the role one played in their group.
BTW, "enforced" feels somewhat punitive. For example, if one's billable time went well over or much under the set targets, their manager would inquire to learn what was up.
Cheers,
Bill
------------------------------
William M. Hayden Jr., Ph.D., P.E., CMQ/OE, F.ASCE
Buffalo, N.Y.
"It is never too late to be what you might have been." -- George Eliot 1819 - 1880
------------------------------
Original Message:
Sent: 11-01-2022 06:16 PM
From: Christopher Seigel
Subject: Utilization Rates And Their Effects on Companies and Employees
I was talking to an engineer from a different company recently. They mentioned that their employer required that staff at their company have a very high utilization rate (it was somewhere over 95% but I do not recall the actual number.)
(For those who may be unfamiliar with the term, a utilization rate in this context is the amount of an employee's hours that are billable to a client/generating income. Essentially, doing your technical job).
From an efficiency perspective I can see the value of utilization rates, as they can be an easy way to determine whether the amount of work currently available is too much or too little for the existing staff. They can also prevent one staff member from going to disproportionately more trainings or conferences than other staff members. And I recognize that even if there is no official utilization rate that a staff member is held to, there is naturally a certain number of hours everyone at the company needs to work (on average) to turn a profit and keep everyone employed.
However, I can also see ways in which this can be abused, particularly if a new hire is not familiar with the concept. For example, if an employer provides 160 hours of PTO in a year (let's say 2 weeks of sick time and 2 weeks of vacation), but also holds staff accountable to a utilization rate of 95%, then the employee will not actually be able to take all of their PTO (I am assuming 2000 working hours per year in this case). In some cases, it can also presumably force employees who aren't eligible for overtime pay to work overtime in order to be allowed to take vacation or sick time at a later point. There are ways around this, such as not including PTO/holidays in the count for the utilization rate and only using it to track an employee's workable hours vs. the hours spent on trainings/conferences/etc. I understand that in many places, tasks such as proposal writing are not billable and therefore will count against an employee's utilization rate as well.
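The PTO squeeze described above can be checked with quick arithmetic. Here is a minimal sketch in Python; the 2000-hour year, the 160 hours of PTO, and the assumption that PTO counts against utilization are all taken from the paragraph, not from any particular company's policy:

```python
# Sketch of the utilization/PTO arithmetic from the example above.
# Assumed definition: utilization = billable hours / total paid hours.

TOTAL_HOURS = 2000        # assumed working hours per year
UTILIZATION_TARGET = 0.95 # required utilization rate
PTO_HOURS = 160           # 2 weeks sick + 2 weeks vacation

required_billable = UTILIZATION_TARGET * TOTAL_HOURS      # 1900 hours
non_billable_allowance = TOTAL_HOURS - required_billable  # 100 hours

# If PTO counts against utilization, only the non-billable allowance
# is actually available for time off:
usable_pto = min(PTO_HOURS, non_billable_allowance)
shortfall = PTO_HOURS - usable_pto

print(f"Required billable hours:      {required_billable:.0f}")
print(f"Hours left for PTO/training:  {non_billable_allowance:.0f}")
print(f"PTO the employee cannot take: {shortfall:.0f}")
```

At a 95% target, only 100 of the 160 PTO hours fit within the year, leaving 60 hours the employee cannot take without working extra hours elsewhere.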
I was wondering if anyone else had experience with clearly stated and enforced utilization rates. Are they a net positive or negative for the company or the employees? What is their impact on staff morale/turnover/efficiency/training opportunities/etc?
------------------------------
Christopher Seigel P.E., M.ASCE
Civil Engineer
------------------------------