As someone who has spent many years watching trends in system design come and go, I find the current move toward microservices very interesting. Many years ago I was sitting in a systems design class, with an adjunct professor leading me through Systems Design II. The professor said, “The key concept to good systems design is to let the people do the people work and let the machines do the machine work.” I took his comment to heart, and it has influenced just about every decision I’ve made as an IT professional during my career. It has not always been possible, but I’ve learned that if you let the machines do the machine work, the people are much happier and more efficient.
Every few years, it seems, programmers develop new methodologies or remake old ones into something new, which lets people create applications that are ever-evolving.
It seems now that everyone is using microservices in their designs. So what are microservices? As the name implies, they are small pieces of functionality that do one thing and do it well. Sometimes a microservice is local to the main application, and sometimes it is another cloud service, such as one of the many email-sending providers.
This post is not designed to be a programming architecture discussion, but rather a management of technology discussion.
Colleges consist of multiple departments, and a direct measure of how well the overall college is functioning is how well those departments can exchange needed information. Every function of the college depends on this exchange in some way, and if it’s taking weeks, or even days, to move an applicant file from testing to a decision, it’s not hard to predict that the overall applicant matriculation percentage will suffer, lowering the overall efficiency of the institution.
Four-year colleges have student churn, but it’s not nearly as intense as what community colleges face. Most community colleges are designed to provide training in support of economic development in a community. This means shorter programs, and students are always applying to the college, so every term there is a rush of applicants who must be processed. Downturns in the economy cause an increase in the number of people going back to community college to learn a new job.
So now back to the microservices concept. The main advantage a college can exploit is that microservices are built to share data as a primary function and to handle one piece of functionality. Expand that concept to the cloud and think about all the various cloud services, from email providers, to text messaging, to storage blocks. This makes it easy to spin up an efficient system to pass this data back and forth and help process these applicants, improving communication to the applicant and notifying the staff of an applicant’s question.
Taking the micro-functionality approach, you can build everything required to efficiently push applicants through the application process: prospect -> applicant -> tested -> documents submitted -> awarded aid -> registered.
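The pipeline above can be sketched as an ordered enumeration. This is just an illustrative sketch, not part of any Banner module; the stage names simply mirror the list in this post.

```python
from enum import IntEnum
from typing import Optional


class ApplicantStage(IntEnum):
    """Ordered stages of the admissions pipeline described above."""
    PROSPECT = 1
    APPLICANT = 2
    TESTED = 3
    DOCUMENTS_SUBMITTED = 4
    AWARDED_AID = 5
    REGISTERED = 6


def next_stage(stage: ApplicantStage) -> Optional[ApplicantStage]:
    """Return the stage an applicant should be nudged toward next,
    or None once they are registered and the pipeline is complete."""
    if stage < ApplicantStage.REGISTERED:
        return ApplicantStage(stage + 1)
    return None
```

Because the stages are ordered integers, "what do we tell this applicant next?" becomes a simple lookup instead of a pile of special cases.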
You just have to ensure the applications you choose can easily expose or consume data in the standard JSON format.
I can hear the security guys losing it about now. Wait, you are advocating sending applicant and student data into the cloud?
Nope, this is what you send.
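A minimal sketch of what "this" might look like: a payload holding only directory information, serialized as JSON before it ever leaves campus. The field names here are hypothetical stand-ins, not any particular provider's schema.

```python
import json

# Only FERPA directory information crosses the wire --
# no grades, SSNs, test scores, or financial data.
# Field names are hypothetical; map them to your provider's contact schema.
payload = {
    "email": "applicant@example.com",
    "attributes": {
        "FIRSTNAME": "Pat",
        "LASTNAME": "Student",
        "PIPELINE_STAGE": "TESTED",  # where they sit in the admissions funnel
        "NEXT_STEP": "Submit your high school transcript",
    },
}

message = json.dumps(payload)
```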
Since you only need to send directory information, the FERPA rules do not come into play the way they would if you sent the entire applicant file.
Creating a view of your data that lists your applicants, segmented into each of the major stages of the applicant acceptance process, allows you to automate communicating the next steps the applicant needs to take in order to become a matriculated student.
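The segmentation logic behind such a view can be expressed as a simple "highest stage reached" check. The boolean flags below are hypothetical stand-ins for whatever your SIS view exposes; the point is only the ordering of the tests.

```python
def classify(applicant: dict) -> str:
    """Map an applicant record to its pipeline segment.

    Checks run from the end of the pipeline backward, so an applicant
    lands in the furthest stage they have actually reached.
    Keys are hypothetical stand-ins for flags derived from your SIS.
    """
    if applicant.get("registered"):
        return "registered"
    if applicant.get("aid_awarded"):
        return "awarded aid"
    if applicant.get("transcripts_received"):
        return "documents submitted"
    if applicant.get("placement_tested"):
        return "tested"
    if applicant.get("application_complete"):
        return "applicant"
    return "prospect"
```

Run every applicant through a function like this nightly, and each segment becomes a ready-made mailing list for the next-step message that stage calls for.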
What are the major bottlenecks in the admissions process at community colleges?
The illustration above shows a typical process for community colleges when admitting a new student. There may be more or fewer steps at your college, or you may be thinking of other tasks that are part of moving an applicant through the stages listed above.
Each of the arrows in the illustration marks a potential bottleneck in getting an applicant through the process, and any one of them can prevent you from turning an applicant into an enrolled student by the start of the term.
Each of these bottlenecks can also be measured, which lets you apply Lean Six Sigma calculations or other formulas to project future process improvements based on past performance. Fabulous data for when the accrediting people come to visit; showing evidence of improvement efforts is never frowned upon in those meetings.
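Measuring a bottleneck can be as simple as computing how long applicants sit between two milestones. A minimal sketch, assuming you can pull a milestone date per applicant from your SIS (the field names are hypothetical):

```python
from datetime import date
from statistics import median
from typing import Optional


def stage_duration_days(applicants: list, start_field: str,
                        end_field: str) -> Optional[float]:
    """Median days applicants spend between two pipeline milestones.

    Applicants missing either date (still stuck in the stage) are
    excluded. Field names are hypothetical; pull the real dates
    from your SIS view.
    """
    gaps = [
        (a[end_field] - a[start_field]).days
        for a in applicants
        if a.get(start_field) and a.get(end_field)
    ]
    return median(gaps) if gaps else None
```

Track that number term over term for each arrow in the diagram, and the before/after comparison is exactly the evidence of improvement an accreditation visit wants to see.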
Luckily, every step in the admissions process has a standard action: the applicant must be told the next task they need to perform. If the applicant is serious about learning a new trade, they will make their testing appointment and submit their transcripts by the deadlines. But are they always told, 100% of the time, and maybe even reminded more than once? After all, the majority of community college students are older adults with jobs, kids, and whatever other crises to handle. Sometimes they need to be told more than once, in more than one way.
So, who does the work of reaching out to applicants at the various stages of the application pipeline to let them know the college is still missing their high school transcript?
Is it the people, who must run a Population Selection in Banner or some other report and then send the applicants a message in email? Maybe you have the Recruit module, which helps, but a lot of schools have to manually pull data out of Ellucian Banner to send students an email.
By using cloud services, this functionality can be achieved by sending a simple JSON message to your cloud provider. Some providers, such as SendInBlue and SendGrid, allow you to update your prospects’ attributes through their web APIs as well as send emails. Both services also provide text messaging, letting you reach applicants “where they are” on the vast worldwide web.
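As a sketch of what "a simple JSON message" looks like in practice, here is a send via an HTTP POST. The payload shape follows SendGrid's published v3 mail-send format as I understand it; verify the endpoint, schema, and authentication against your own provider's documentation before relying on it.

```python
import json
import urllib.request


def build_mail(to_addr: str, subject: str, body: str, from_addr: str) -> dict:
    """Assemble a message in the shape SendGrid's v3 /mail/send
    endpoint expects (verify against your provider's current docs)."""
    return {
        "personalizations": [{"to": [{"email": to_addr}]}],
        "from": {"email": from_addr},
        "subject": subject,
        "content": [{"type": "text/plain", "value": body}],
    }


def send_mail(api_key: str, message: dict) -> int:
    """POST the JSON message to the provider; returns the HTTP status."""
    req = urllib.request.Request(
        "https://api.sendgrid.com/v3/mail/send",
        data=json.dumps(message).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

That is the whole integration: no mail server to run, just a JSON document and an API key.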
This allows you to segment applicants based on where they are in the process and enter them into workflows that eliminate the need for human interaction until the applicant exits the workflow. It sounds like the machines are now doing the machine work, and the people can focus more on the people work.
If you are still with me, you are probably asking about the data. Especially if you are the Ellucian Banner DBA who would be responsible for pulling together a single view of the applicants. I’m sure you have cussed a few times, maybe even thrown coffee at the monitor.
I can just hear it: “Man, do you know how many tables I would have to touch? There is saradap, sfrstcr, soatest, shatrans, and a crapload of those damn R tables. And I’m supposed to get that in a view? There are a multitude of paths an applicant could take just to determine whether they need to test or not. It would take me months to write.”
Luckily, I already wrote it in a package designed to run across RDBMS instances. The EMAL, RSTS, and TESC codes, and even the TERM code, don’t matter; the package figures out what they should be based on a script the DBA inserts into the database.
I’m not sure what I’m going to do with this little bit of code; I may just share it with a fellow educator who would like to tinker. Just get in touch. I would need to comment it better first.
So I guess what I’ve been talking about doesn’t totally fit the bill as a microservice. I’ll just coin a new term; let’s call them MicroFunctions and Packages of Functions. These types of services should be making their way into every IT shop, and every IT Director and CIO needs to start thinking about how to safely integrate cloud functionality into their processes without violating FERPA. It’s just becoming too expensive to maintain those huge internal server farms, and rewriting the functionality you would get from an email provider makes running your own mail platform a legacy exercise, behind the times.