I have seen a trend with my STEM-connected colleagues over the last six months wanting to discuss concepts of adaptive or competency-based learning (CBL). These discussions evolved for many reasons, such as a lack of classroom space, course-scheduling problems, or issues surrounding non-tenure-track faculty. This discussion is right on target when it occurs with younger faculty; now, however, older faculty are asking questions and seem to be contemplating how this could work. Generally there is agreement that it is inevitable that education will move in this direction, but once you start talking about the repercussions for the higher education business model, fear returns to the conversation. I guess what is different is that there now seems to be recognition of the value of the learning model, and discussions are tending toward how we might implement it.
I decided to write this post after a number of discussions yesterday, some stimulated by those who viewed CNN's airing of the "Ivory Tower" documentary. As we talked about applying adaptive learning to our STEM courses, I was drawn to the vision of the old one-room schoolhouse. STEM, possibly more than any other academic discipline, is based on building blocks or competencies. Math and the sciences dominate this, with competency-based requirements built into courses as well as interdependencies between courses. So when I thought of the one-room schoolhouse, I saw a parallel to the students that we receive. In the one-room schoolhouse, students had to progress through levels of reading, writing and arithmetic, and they had a built-in remediation process: the teacher was there to help at all levels.
How did we get to our current college degree attainment path, based on taking a selected number of courses that may or may not actually give you all of the competencies that you or your employer desire? I think we used to have a much more standardized entry path to college. Students from high school, mostly Americans, had very similar competencies due to similar curricula that could not be supplemented by additional information as is now available via the Internet. The overachievers could go to World Book, but for the most part, if students got accepted to college they pretty much entered at the same level, and progression through a standard set of courses with a few electives worked fine. That world no longer exists. We have screwed up high school by believing that standardized testing validates competencies. Combine that with the financial pressure universities are under to maintain enrollment and you end up with a freshman class that is much more in line with the one-room schoolhouse.
Change is coming, and it will be heavily influenced by competency-based learning; I think STEM may be well positioned to adapt to this. We have been working on this concept in our general-ed core curricula of math and science. At first it was about trying to figure out online or hybrid learning, but now we are starting to see how we may need to change the academic business model. The emerging CBL providers such as Western Governors are built upon a personalized learning foundation that allows students to progress at their own pace. Tuition is based on a period of time, not on credit hours, which creates the incentive of "the faster you progress, the more you save." Maybe there is a hybrid version of this that can work for the traditional residential university.
I'm going to take a stab at what this might look like for STEM degrees. I'm looking at this as a realist, considering what might be acceptable to our entrenched higher education culture, today's student, and the political and financial forces that will inevitably force the change. The first two years of most STEM degrees are fairly similar, based on the need to build a foundation of math through calculus and basic concepts for the sciences, with English and physics typically being foundational as well. This is true for pre-meds through engineering, and it is typically fairly challenging, designed to ensure that we are not wasting our time once students reach the upper levels of the degree program. So how about a one-room schoolhouse for each STEM discipline, complete with a set of competency-based learning modules designed with assessments that provide adaptive options to complete each step? We have talented non-tenure-track faculty always available and still teaching, but not on a fixed lecture circuit. The environment would facilitate collaborative learning along with the necessary lab requirements. The student pays the same tuition, and heck, we even keep the semester structure. The advanced students finish early or have more time for extracurricular activities such as undergraduate research or experiential learning options. As students emerge from this general-ed core, they enter the more traditional degree completion, with the upper-level courses and labs taught by tenure-track faculty to complete their STEM program.
I’m going to stop here without digging into the obvious questions and details. But what do you think? I think it might be an improvement.
While eating lunch at my desk I opened up the webcam view of our new Nonavitra 6K Visualization Wall we built for use in the library. Three students jumped on the system and proceeded to spend 15 minutes exploring chemical bonding options, starting from some periodic table application. I wouldn't say that it was utilizing hi-res graphics, but what was important is that the students were having such a great time exploring. This brings me to what I feel is one of the most important reasons for giving our students access to this visualization resource: the opportunity to explore and gain experience working with resolution that is typically reserved for corporate showcases or expensive research facilities.
The library had an open house a few weeks ago where they introduced Nonavitra, and ever since we have seen the reservation schedule for the resource fill up with student groups, especially in the evening. In fact, one of the first uses for the wall was the rugby club using it to scout a future opponent. But what I love is that student study groups are reserving it.
In the beginning, my Research Support team started bugging me to allow them to build some sort of a visualization facility. They wanted to build an immersive visualization experience reminiscent of CAVE2 at UIC's Electronic Visualization Laboratory (EVL). And yes, that would be fabulous, but we need to walk before we run, which is why user adoption is the overriding requirement. Last year's V4DiR focused on 3D data review, and the Nonavitra Visualization Wall now allows us to put a powerful visualization resource in the hands of our faculty and students. The one condition that I set was that I would not build a visualization resource that would become relegated to providing campus visitor demos. We seem to be having success with these technology rollouts. The 3D Printer program in the library has been extremely successful. The secret to success is to put your effort into engineering the business process for making the resource available.
I was outside recently watching our Research Support Student employees fly our helicopter drone over campus capturing some great autumn video.
I also had my dog, Abby, with me since I had designated it "bring your dog to work Friday." While Abby was generating interest from students homesick for their pets, I got into a conversation with a couple of our IT Support Services student employees about their programming ideas. These ideas come from our encouragement for our student employees to explore ways that we might improve our business processes. The idea was a web app that the student workers could use to trade shifts with their coworkers. The gist of the conversation quickly focused on their perception that IT only recommended development based on Perl. Well, he mentioned this to the right guy (actually the boss), but really, what a ridiculous perception, one that obviously has roots in the past. That type of preferential influence will not fly today.
Perl does have a significant development presence here, and there is nothing wrong with Perl, but that should not dictate the requirements of future development. The student asked if he could develop in PHP but was told that PHP was not secure. Well, maybe that is a general statement with some merit, but it is probably not a concern for a student-employment shift-sharing application. The student actually wanted to use Python, and then our conversation steered toward new ideas, like the possibility that WordPress might be utilized. My major point in bringing this up is that we in IT have to be cognizant of the influence we convey, and that our way is not the only way. IT should remember when they were the radical adopters of new application platforms. Consider the fights they must have had with COBOL and Fortran proponents.
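To make the shift-trading idea concrete, here is a minimal sketch in Python (the language the student wanted to use) of the core swap logic such an app might need. Every name here is hypothetical; this is an illustration of the concept, not our actual design.

```python
from dataclasses import dataclass

@dataclass
class Shift:
    worker: str
    slot: str  # e.g. "Fri 14:00-18:00" (hypothetical slot format)

def trade(shifts: list, slot_a: str, slot_b: str) -> bool:
    """Swap the workers assigned to two slots; return False if either slot is missing."""
    a = next((s for s in shifts if s.slot == slot_a), None)
    b = next((s for s in shifts if s.slot == slot_b), None)
    if a is None or b is None:
        return False
    a.worker, b.worker = b.worker, a.worker
    return True

schedule = [Shift("Alice", "Fri 14:00-18:00"), Shift("Bob", "Sat 10:00-14:00")]
trade(schedule, "Fri 14:00-18:00", "Sat 10:00-14:00")
print([(s.worker, s.slot) for s in schedule])
# → [('Bob', 'Fri 14:00-18:00'), ('Alice', 'Sat 10:00-14:00')]
```

A real version would of course wrap this in a web framework with authentication and supervisor approval, but the point stands: the logic is simple enough that language choice should follow the developer, not an IT preference.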
Reprint of IT Facilitating Teaching and Research: this article of mine, recently published in CIO Review Magazine's Education Edition, offers my observations about how IT needs to take a more deliberate role in not just supporting academics and research but also stimulating them. I used our example of investing in an SAP HANA Appliance and committing staff resources to support its use for teaching and research. This has proven beneficial even setting aside the hands-on experience for our Business ERP students and the Autism Spectrum Identification research project now using it; it has been successful just by expanding our staff expertise in this area.
Now we have a few more of our investments showing potential, and all of these solutions have been developed by our student employees. We built an 8×10 ft 3×3 video wall, which our students named Nonavitra, and located it in the library so that everyone can use it. Here is the Library Guide describing what it is and the procedure for access. We are seeing use ranging from the ERP business students presenting their SAP dashboards, examination of gigapan photos from geology, an Electromagnetic Compatibility display from ECE, and students exploring NASA Eyes, to the Rugby Club using it to watch video of an upcoming opponent.
We are also looking at mounting a LiDAR camera on our new helicopter drone. Again, another example where IT provides a tool that can stimulate research, but IT also owns the maintenance and operator expertise that is critical for taking advantage of the drone. We have built a digital signage solution that we call MinerBytes, based on the Raspberry Pi computer, that is being deployed throughout campus. We are also close to finishing our production-version Segway. I will save the Segway story for another post, but again, it is another project which provides incredible experience for students that just might translate into something that can benefit our university.
My RTD review post, as promised, is mostly to confirm how satisfied we were with the success of the conference. Attendance doubled from the previous year, the keynote presentations were right on, the sessions were valuable and well attended, and the fireworks were even more spectacular. This is not your normal Research and Technology Development Conference, because Missouri University of Science and Technology is not your normal research campus. The difference centers on S&T's need and desire to collaborate with regional research universities. Missouri S&T confirms this focus by going above and beyond to throw not only a rich professional conference but also an extremely enjoyable experience. All this comes at the cost of S&T staff and partners working hard, but they are driven by the rewards.
Wesley Chun, from Google and author of the "Core Python" series of books, kicked off with a keynote presentation that helped everyone understand the value of embracing new technology even if it might be disruptive. Mark Suskin, PhD, from the NSF Division of Advanced Cyberinfrastructure, followed up with a reflective look at our traditional public research funding model and asked us to offer new ideas for what that model might look like.
The three pillars of Computational Science, Additive Manufacturing and Large Scale Visualization provided very distinct and engaging conference content areas. Many new professional connections were established for those seeking information about these pillars.
The talk of the conference again will be the Monday Night Social Event. Anytime you offer some of the finest BBQ in the region along with a selection of local beverages and the finest pie in the land, you have a winner. Follow that with a custom fireworks show put on by your own Explosives Engineering department, and end it with a high-energy rock band, and you greatly improve the technical conference experience. Yes, it was a great time. Thank you to all who were involved.
Check out the time-lapse of setting up the Video Wall.
A benefit that I thoroughly enjoy from being the CIO at Missouri University of Science and Technology is the opportunity to promote, support and participate in research activities. The capstone event that represents IT's involvement with research is our "Research and Technology Development Conference," RTD2014, which takes place next week, September 15-16. Putting together a conference such as this is an incredible amount of work, which tells you that last year's event must have been successful or we would never have committed to another year. Actually, that is true: at this point in the life span of RTD, the 4th annual, a negative outcome would cause us to abandon the effort for the following year. But no, this year's RTD will be even more amazing, and we will probably be motivated to continue the tradition.
Keynote speakers this year include:
(Monday) Wesley Chun, from Google and author of the “Core Python” Series of books
(Tuesday) Mark Suskin, PhD, from the NSF Division of Advanced Cyberinfrastructure
The conference content is focused around three major pillars:
Computational Science, led by University of Oklahoma and University of Nebraska
Additive Manufacturing, led by University of Louisville and Missouri S&T
Large Scale Visualization, led by Indiana University and University of Texas
Monday morning kicks off with various workshops focused around our pillars, plus a very popular Python workshop provided by Wesley Chun. The sessions scheduled for Monday afternoon and Tuesday morning will showcase the latest developments in the three pillar areas, driven by the lead institutions.
In addition, the RTD Monday Night Social is a networking event not to be missed. In keeping with S&T interests, there will be catered BBQ (from various vendors across Missouri), music, and an incredible fireworks show by the S&T explosives experts.
So why do we do this? The original driver for such a conference was the need to create an opportunity for your university research community to collaborate. Typically this is called Cyberinfrastructure Days at many research institutions, and it is required if you receive government research funding. Last year we decided to expand this to more of a regional event, thinking that it would promote more intercampus collaboration. It has definitely stimulated more regional research collaboration with respect to sharing resources such as HPC. The potential of RTD justifies the investment, especially with help from our vendor community, but it is our amazing staff and students who make it successful. We plan on launching a number of new projects after introducing them at RTD. I'll follow up with a recap post.
I try to discuss innovation and disruption in higher education on my blog. However, it is difficult at best to dig too deep into these areas since I am digging from the inside. That sounds a bit like digging your own grave, and I'll just leave that comment hanging. But I have been accused of being both the most innovative and too innovative, and because of that I must carefully manage that perception as it relates to disruption. Higher education as it is primarily established today cannot handle the disruption which tends to evolve from innovation. Very sad really; it means that any innovation in higher education must fit into the existing structure, which tends to predict its doom. But it is that structure that is predicting higher education's doom.
My motivation to open up this topic comes from my increased interactions with our corporate partners looking to hire our students. It is a good thing that we have corporate partners who want to build a relationship with us, because the trend is not necessarily moving in that direction. Two recent Gallup polls revealed that although 96 percent of chief academic officers believe that they are doing a good job of preparing students for employment, only 11 percent of business leaders agree that graduates have the requisite skills for success in the workforce. I hear the same concerns; thankfully we do produce graduates that are acceptable to employers, but we cannot rest on our reputation. The skill sets needed by employers are changing much faster than our curricula.
It is commonly accepted that higher education is approaching a bubble of dramatic disruption. Theories on what that might look like range across the spectrum, typically depending upon what role one plays in the industry. But when you step back from personal feelings, it is hard to understand how this system designed centuries ago can continue much longer without some serious overhaul. Of course, change or innovation rarely occurs from within; it will be outside forces that create the bubble. Those forces evolve from our customers and the options that they explore. I think the most significant force will come from the employers of our graduates. The Christensen Institute has helped alert us to disruptive signals over the years, and I think they have produced an excellent review of how employers are shifting their tactics in their latest publication, "Hire Education".
The publication, as mentioned in the video, shifts focus to an examination of online competency-based education. Unfortunately for our traditional institutions of higher education, online competency-based education would probably have the most disruptive effect imaginable on our current business model. I do sympathize with the overall value proposition that higher education offers, and we should not lose what is working in HE, but I think we know that change is coming, so shouldn't we be planning for it? Read the "Hire Education" report with an open mind and consider how we might adapt our credit-hour, semester-based approach to conveying a degree. I am fascinated by how we might adapt our ERP systems. I could see year-round college campuses where you protect all that is great about a residential and experiential learning college experience. Maybe some of the students are working in a competency-based track and given support from subject matter experts and academic staff. It may not be the tenure-track dream job, but it could still be an extremely rewarding alternative.
The pursuit of a STEM degree has gained significant attention in recent years as we evaluate the ROI of a college degree. A recent article in NerdScholar by Yesenia Rascon, "Top 5 Reasons to Apply to a Research University," highlights the importance of experiential learning, access to research facilities and hands-on career development, and it captures many of the reasons we allow our IT student workers the opportunity to participate in exploratory projects. This all relates back to a culture that we promote for our very successful IT Research Support Services (RSS) group here at Missouri S&T. I have been fortunate to be in a position to carve out some IT budget to dedicate to research support. However, because some of my funding comes from student tech fees, I make sure that the students benefit from our efforts. This translates into the hiring of student workers, but extends beyond traditional tech-support jobs. We hire students in RSS who seek out that opportunity, and we benefit from the important support services that they are able to provide to our university. However, we also reward them with the opportunity to own their own research projects. Our staff does offer advice and support, but we also let the students fail.
Our students also earn the right to attend national research conferences such as the annual SuperComputing and Great Plains Network. These opportunities give them excellent presentation experience, which we utilized this summer by having our students conduct a workshop for the CyberMiner camp for high school students. We asked them to present their current projects to about 50 high school juniors and seniors. We designed the workshop to encourage the campers to engage with our students, and it was truly an inspirational Geekfest showcasing our future technology leaders.
Here is a quick glimpse of the projects they presented and a sense of the workshop.
MinerBytes is a digital signage project based on the Raspberry Pi computer connected to any monitor, with access control given to designated administrators. This was a project conceived by a biology student last summer, and this summer we are preparing it for version 1 production deployment on campus and in our community. Somewhat of a surprise to us was that this project generated the most interest among the high school students, as they were intrigued by the coding behind MinerBytes.
The Helicopter Drone Project is in its infancy, which made it a good way to show the campers how a project gets birthed. We don't know where this project will go, but we believe we should be on top of the explosion in the use of drones. We have ideas for using it to create virtual tours.
The Segway project started last summer and has proven to be the perfect multi-discipline opportunity for our students. With heavy electrical, mechanical and software development components, we have had many students involved with this one. Our students presenting the Segway gave the campers some excellent advice based on their experience designing the controller boards, which they fried more than once. They told the campers that what they appreciate most about their opportunity to work on these projects is that they are allowed to fail, and that has been their greatest learning experience.
The Segway prototype moved to a production design this summer, which offered an excellent opportunity to display how they used SolidWorks design software on the new Video Wall that RSS built this summer. The Video Wall, currently named MinerView, is built on solid computer video display principles but was built from scratch, with special attention given to the structure that mounts the nine 55-inch high-resolution monitors. The students had just a few hours to assemble the video wall in the classroom used for the workshop.
The Video Wall will be used in the upcoming Research and Technology Development Conference, #RTDatSandT on September 15-16 where representatives from Indiana University and the University of Texas will show off the latest in visualization techniques. RTD2014 is another great opportunity for students at S&T.
Of course the Video Wall has many uses and will be an important addition to our Library where it will be made available to the entire campus for visualization. We already know that it will be instrumental as a foundation for our Business and Information Technology department’s ERP Center.
Hopefully this gives you an idea of what is possible if your Information Technology department combines the needs of the university with an opportunity for experiential learning.
Just returned from my best backpacking trip ever, in the Goat Rocks Wilderness area between Mt. Rainier and Mt. Adams in Washington. I was there July 27 – August 1 and the weather was perfect. The mosquitoes were a bit aggressive but manageable. Overall the trip included about 33 miles and 5,000 ft of vertical.
This video of the falls below Goat Lake tries to convey the enormity of the area.
I leave tomorrow for my annual backpacking trip. This will be five days in the Goat Rocks Wilderness Area in Washington. Going totally offline is a good thing to do. Last time I did this I remember having over 1,000 emails to deal with, but life goes on without us. The backpack is ready, weighing in at about 37 lbs, and I am in fairly good shape, having been able to do a few short mountain day climbs this last week. The Cascade Head hike offers spectacular coastal views with a very steep vertical climb. Hart's Cove ends out on the coast in a secluded cove, but it is all downhill (1,000 ft vertical) to get there. When I return I will post a trip report to the Portland Hikers website. Here is a trip report from my first backpacking trip, to the Three Sisters Wilderness Area near Bend, OR.
In my many years as an IT leader in higher education, there has always been a relationship with corporate partners looking to gain a recruiting advantage with our graduates. When I was at George Fox University, a Christian institution, the recruiters would come through me looking for any tech-savvy grads I was aware of, because they desired their solid work ethic and integrity. Here at Missouri S&T, the recruiters are looking for an advantage in connecting with our best students. The problem here is that we do not produce enough graduates, so the recruiters are looking for any opportunity possible to lure students to consider their company.
We know that employers of our tech graduates in the US want our grads to possess stronger communication and collaboration skills. And I think it is understood that more technology awareness is desired for all graduates. But it has been interesting to confirm feedback from recruiters of our S&T grads about how highly they prize our non-STEM graduates. Yes, we do produce some graduates with degrees in the humanities, and they are sought after because they are forced to build a strong technology-based foundation. This is partly because of the general curriculum requirement of at least 10 natural science or mathematics credits. But the employers say it is also because of the technology culture of the campus, which forces those grads to become extremely comfortable working with their fellow STEM students. Due to the many cross-discipline group projects, our humanities students learn valuable skills in how to work with these sometimes socially challenged STEM students.
Not sure if this justifies anything, but it sure can't hurt to consider greater exposure to STEM curriculum and culture for all college graduates. We do need these humanities grads to help those scientists and engineers have more productive careers.
This recent Bloomberg article, "Silicon Valley's Talent Grab Spawns High School Interns," should be a wake-up call about higher ed's inability to produce enough product. The reality is, why wouldn't tech firms get their recruits on the front end? Of course, that is what we should be doing more of.
"Big Data" has definitely become an overused term, fueled by the barrage of vendors connecting to it to sell new solutions. It seems like any group that is dealing with data is now referring to it as "Big Data," and in some situations, like large research data sets, the term is technically correct. The actual definition, "data sets that are too large and complex to manipulate or interrogate with standard methods or tools," does create a broad category. I think of "Big Data" from the manipulate-or-interrogate standpoint, requiring techniques to manage the data (Hadoop) and process it (MapReduce) using computers with large amounts of RAM. And it gets very confusing as we apply our traditional relational DB and BI concepts. But I'm not the one worrying about how it works; I'm trying to figure out the most effective way to make it work, and that relates to skills, budgets and data centers.
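For readers new to the processing side, the MapReduce idea mentioned above can be sketched in a few lines of Python. This is the concept in miniature (word counting over a couple of toy documents), not Hadoop itself, which distributes the same map, shuffle and reduce steps across a cluster of machines.

```python
from collections import defaultdict

def map_phase(document):
    # Map step: emit a (word, 1) pair for every word in the document.
    for word in document.lower().split():
        yield (word, 1)

def reduce_phase(pairs):
    # Shuffle/reduce step: group pairs by word and sum the counts.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["big data big ideas", "data sets too big"]
pairs = [p for doc in docs for p in map_phase(doc)]
print(reduce_phase(pairs))
# → {'big': 3, 'data': 2, 'ideas': 1, 'sets': 1, 'too': 1}
```

The power of the model is that the map and reduce steps are independent per word, so a framework like Hadoop can run them in parallel on data far too large for any single machine's RAM.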
A major stimulus for "Big Data" visibility at Missouri S&T is our commitment to offer new Graduate Certificate Programs through Distance and Continuing Education. This has created a flurry of activity in the Computer Science, Computer Engineering and Business Information Technology programs with respect to the creation of new courses and the associated support of "Big Data" teaching resources. We also have significant growth in the need for high-performance and high-throughput computing, so I ask: why can't all of this computing hardware be more effectively utilized across these disciplines? Maybe it can, but today we approach the challenge with our traditional operational methodology, and the solutions don't play well together. I was recently encouraged to find out that others in higher ed are exploring this terrain of HPC and Hadoop operations. One of our collaborators, Kansas State's Beocat, is running into resource-scheduling challenges, but they hold out hope that there must be solutions.
So what can we do, with a meager budget and limited infrastructure, to become a player in "Big Data"? We start by enhancing our skill sets, adapting our traditional DBA talent to Hadoop concepts and steering our analytics specialists to experiment with these new BI tools. Luckily it is affordable to venture into Hadoop-based data management, and there are plenty of open-source BI add-ons to get your feet wet. This builds a strong foundation that may produce valuable breakthroughs for more effective teaching and research. But we are going to take this one step further.
Working with Hadoop may establish some "Big Data" concepts that relate to the commercial space, similar to how working with MySQL may simulate Oracle DB principles. But is that enough? Does higher education need to be offering teaching and research with what our employers use? I recognized a disconnect a few years back when I took over teaching an "Information Services" class for our business school. They had been teaching basic concepts of spreadsheets, programming and databases to students who were being groomed as bean counters. I instead taught them basic concepts of ERP, CRM, BI and DW, and had them actively participate in the web by way of blogging and understanding SEO. The motivated students thrived and the others survived. I did get some validation of this approach when one of those students, now pursuing her MBA, commented on how far ahead she was because of her understanding of these real-world solutions.
I mention this correlation between what we teach and research vs. what the commercial world relies upon to explain why I am purchasing an SAP HANA platform to support teaching and research at S&T. Today I would rate HANA as the leader for the utilization of "Big Data" in the commercial sector. It takes a different approach than Hadoop: it is a fine-tuned in-memory appliance specifically designed to produce results for the "Big Data" marketplace. I am finally ready to make the purchase, but it has not been an easy process. I first got the idea when corporate partners, who are always trying to hire our SAP ERP trained business students, mentioned their need for HANA experience. I then equated that to "Big Data" research partnerships, especially with our engineering projects producing large amounts of diverse data. We uncovered some of this with our visualization efforts. But I could not find anyone at SAP who knew how to sell me a HANA solution that was not based on a commercial vertical market. Thankfully Hewlett-Packard, which has a strong relationship with the HANA hardware appliance, saw the opportunity. They had customers all around us who were cautious about committing to HANA because of the lack of qualified talent to drive it. HP saw the potential of S&T graduating students with actual HANA experience, so they helped connect us to the right people at SAP to make this happen.
Is this investment in HANA strategic? That is my hope, but at a minimum I do believe that there will be tremendous value in the exploration. Any exposure for the students will be a win, at least as long as HANA remains a commercial leader. And I believe having our own HANA system will open doors for corporate research collaboration by helping us overcome licensing and intellectual property challenges. The side benefits may be helping us understand how to position "Big Data" processing within our HPC mentality, or applying this experience to challenges we have in managing our own cyber security, learning analytics, retention and recruiting. Maybe the greatest value is to help our academic culture explore a different path.
Update – 6/27/14 – Support for the HANA purchase is strong so we have moved forward with the purchase.
I was chatting with one of our professors and our conversation ventured into the importance of mobile devices. The topic related to why it was so important for Microsoft to gain a foothold in the mobile phone market, and I explained to him the intricate connection between a consumer's phone and their computing platform of choice. But I also told him that the mobile phone would someday be the most important component for authenticating identity, which is critical for financial transactions. I'm not sure I knew exactly how that was going to play out, but it is always fun to stimulate non-techies into imagining what the future might hold. I did tell him about how important cell phones were in Africa as a means of transferring money. So it was a natural assumption to connect the cell phone to the online or digital economy as a way of providing a more secure form of authentication. And when you say more secure, you typically mean two-factor authentication based on something you have and something you know, and what better "something you have" than a cell phone? Anyway, this conversation led to my being asked to give a talk on this topic for the local Rotary.
I relate this conversation as a lead-in for today's story about how Apple might change the way we pay for things. Apple is hinting that it may explore the territory of payment services, and that the fingerprint authentication on the new iPhones was implemented with this in mind. But the real impetus may be that Apple has amassed the most impressive number of personal accounts connected to a credit card, about 800 million. This number is huge, especially when compared to the next closest, Amazon's 237 million. And what was the trick to getting this many purchase-ready accounts? Music downloads through iTunes. Yes, the convenience of impulse buying a song I hear justified linking my credit card to my iTunes account. And I have been very pleased with the results: quick, efficient, receipt by email, and trust. Yes, trust; there has not been a significant security breach of Apple's accounts.
So is Apple going to expand its payment services to include any online or even checkout-counter transaction? Lots of issues have to be worked out before that financial model is justified, but I would bet on it. I was originally thinking the mobile phone could provide an identity solution for verifying who you are using the two-step authentication model. Apple has successfully extended that to include biometrics, which I think will inevitably be required in our insecure, identity-compromised world. It makes a whole lot more sense than offering a credit card and signing a receipt. Needless to say, control of the mobile phone market continues to grow in importance. The next authentication phase will probably involve scanning that chip they want to insert into our bodies, but for now I think we work from something that everyone wants to have on their body.
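The "something you have" side of two-step authentication is usually proven with a one-time code that the phone and the server can both derive from a shared secret. As a minimal sketch (not Apple's implementation, just the standard TOTP scheme from RFC 6238 that most authenticator apps use, with an illustrative example secret), it fits in a few lines of Python:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, interval=30, digits=6, now=None):
    """Derive a time-based one-time password (RFC 6238) from a shared secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    # Both sides count 30-second windows since the Unix epoch.
    counter = int((now if now is not None else time.time()) // interval)
    msg = struct.pack(">Q", counter)  # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per the RFC
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# The phone and the server share the secret; matching codes for the
# current window prove possession of the device.
secret = "JBSWY3DPEHPK3PXP"  # illustrative Base32 secret, not a real credential
print(totp(secret))
```

Because the code changes every 30 seconds and never travels with the secret itself, intercepting one code is far less useful to a thief than skimming a credit card number.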
The blog post by Ian Cox about his new book "Disrupt IT" motivated me to offer some reflection on the type of IT disruption that I have needed to employ for my slice of Higher Education. I have not read his book, but I can tell that I would agree with his premise that IT has become the change agent. It is easy to connect technology to why change has accelerated in recent years. But change is not accelerating in Higher Education. To be clear, we do not need to change because of technology; it is technology that has highlighted the need for change. And that is why IT may be the perfect change agent for Higher Education.
Higher Education is still avoiding the real technology elephant in the room: the Internet. We deal with a whirlwind of questions about how students learn, why college costs so much, and why it isn't more about students getting jobs. Maybe we should use more technology in the classroom, or software to manage student success. But it all comes back to the fact that Higher Education no longer controls the data, which is converted into information, which can become knowledge for anyone motivated enough to absorb it.
OK, back to disruption. I came to Missouri S&T because I wanted to make a difference in Higher Education for the STEM segment that I feel is critical to our future. I do believe IT needs to be the change agent, and doing so at such a technology-dominant university is the perfect challenge. Yes, I inherited an IT service model that was catering to our traditional, decades-old higher education culture. And Missouri S&T is facing the same challenges stressing many public research universities in the US: serving increased enrollment with an aging infrastructure using an outdated business model. How can IT help?
First you have to change the culture of your IT staff while also laying the groundwork to change the university's relationship to IT. This is all done by building trust. IT staff who are working in the traditional control-based service model may be reluctant to break out of that comfort zone. IT staff love to be needed, and the old model offered that, but what about innovation? IT staff should be the innovation leaders, or at least they should want to be. I believe the path to a successful IT culture change has to build from IT being innovative and gaining pride from how that innovation can impact the university. And the key to unlocking that innovative spirit in your IT staff is to show them that you mean it. Invest in their ideas, or at least let them own your ideas. And above all, assure them that it is OK to fail.
Gaining trust from your university is trickier, since some of your customers are very content with the old IT support model, which may still support their outdated business model. However, the most important customers are the faculty. The reality for them is that their job has only gotten more difficult. Teaching loads have not decreased and research funding is ever more scarce. IT support for teaching loads tends to point toward the utilization of technology and exploring online delivery. But IT does not need to push any of that; IT just needs to offer assistance in utilizing it. IT does not need to push online learning to secure its value in EdTech support. Faculty need the help; just offer support and leave the politics of course delivery to the Provost. IT support for research likewise needs to follow the assistance model. A researcher used to get a grant that outfitted their lab with technology, managed by a grad student, with enough fluff to allow some breathing room. Today it seems like more time is spent submitting grant proposals than actually fulfilling the research of the successful grants. IT has to find a way to be a trusted partner so that researchers can sell that support to win their grants. This is a budget dance, but IT has to find a way to free up researchers to actually do research.
When IT appears to be achieving positive repositioning, some strategic disruption can put it all together. IT departmental reorganization will inevitably be needed, but turn it into an opportunity. Gain some visibility for IT on campus by offering support to a much-needed service. That might be a service to students, or it might mean supporting another service provider, like the library. Don't lead with a software service disruption; that will come later, and it will probably be IT's greatest contribution, but total trust is needed for that.
One of my challenges in coming to Missouri S&T has been to leverage our meager High Performance Computing (HPC) capabilities as effectively as possible to stimulate learning and non-funded research. This has been an ideal opportunity for me to evaluate the rapidly evolving area of HPC with no predetermined assumptions. An early observation was that we did not have adequate supercomputing resources, but it was also apparent that those with enough resources did not necessarily produce proportional results. What we did have was an understanding of what we would do if we had more resources. If I focused only on HPC, I would find myself in a resource battle, trying to gain recognition in the research community based on cores and compute capability. But we were also interested in visualization, and then along came interest in "Big Data". What I saw was an opportunity.
The one thing I did have was the foundation of an effective research support team, including skill in adapting HPC techniques to fit differences in data and workflow requirements. I also had talented student employees who thought completely outside the box and exposed many new options for us. So we started to see that we could compete in processing by adapting our HPC resources to the jobs being requested. And it became increasingly apparent that we were dealing with data that benefited from some sort of visualization to help identify what we should be looking for. For example, we have gotten good at presenting large data sets graphically over time, with flexible data-attribute selection, when we are just looking for anomalies. Now that we are also exploring "Big Data", I could not help but ask why the concept of large in-memory processing of Hadoop-based data could not be married with traditional HPC and supported by our flexible visualization.
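The anomaly-hunting idea above can be sketched simply: before plotting anything, flag the points that deviate sharply from their recent history so the visualization knows where to draw the eye. This is a minimal illustration using a trailing-window z-score (the thresholds, window size, and synthetic data are my own assumptions, not our production tooling):

```python
import random
from statistics import mean, stdev

def flag_anomalies(series, window=50, threshold=4.0):
    """Flag points that deviate sharply from the trailing window's statistics.

    Returns a list of (index, value) pairs worth inspecting visually.
    """
    anomalies = []
    for i in range(window, len(series)):
        trailing = series[i - window:i]
        mu, sigma = mean(trailing), stdev(trailing)
        # A point more than `threshold` standard deviations from the
        # trailing mean is a candidate anomaly.
        if sigma > 0 and abs(series[i] - mu) > threshold * sigma:
            anomalies.append((i, series[i]))
    return anomalies

# Synthetic sensor stream with two injected spikes at indices 200 and 400.
random.seed(1)
data = [random.gauss(0.0, 1.0) for _ in range(500)]
data[200] += 15.0
data[400] -= 15.0
print(flag_anomalies(data))
```

The point of the sketch is the workflow, not the statistic: any cheap first pass that narrows millions of points down to a handful of candidates makes the interactive visualization step tractable.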
It now appears that my first year of exploration is starting to take shape. I have strengthened my human resources and have discovered that the human element is the most scarce, or at least a flexible human resource team such as ours is. So now I have some financial resources to invest, and this understanding of the interrelationships of these research tools is helping to stretch what I hope to accomplish. Most of our HPC cluster is devoted to students, so we need a base HPC investment devoted to non-funded research. For us that goal is probably 1,000 cores. But our success is not going to come from those 1,000 cores; it will come from the collaborations we have developed with neighboring university computing centers, which realize that we have more to share than just HPC. We can help them optimize their thousands of cores for the specific computation desired. A good example here is computational chemistry.
I mentioned exploring "Big Data", which has become the darling of big-iron computer sales. In simplest terms, "Big Data" is about managing large, diverse data sets and processing them with large amounts of memory. The real driver of "Big Data" is the need to analyze the massive amounts of real-time data flowing in about customer buying habits. But of course we have been led to believe that all of our analytical investigations should use "Big Data". That is not true for analyzing student data, but it can be true for analyzing some forms of scientific data. And guess what "Big Data" really means: too big to visualize with traditional spreadsheet-type tools. So I am thinking, why can't we blend HPC and "Big Data" with our new, nimble visualization techniques? We have all the ingredients, and the most important turns out to be the human factor. So now I am throwing some DBAs into the equation, along with scientific software engineers, with plans to expand the visualization resources. We should be able to provide most of our processing needs locally or via sharing with regional partners. Add in efficient on-ramps to XSEDE and Open Science Grid and we can compete with anyone.
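For readers outside the "Big Data" world, the processing model Hadoop popularized, and that in-memory platforms accelerate, is simpler than the hype suggests: each node summarizes its own chunk of the data (map), then the partial results are merged (reduce). A toy sketch of that shape, with illustrative sample text and nothing vendor-specific:

```python
from collections import Counter
from functools import reduce

def map_chunk(lines):
    """Map phase: each 'node' counts words in its own chunk of the data."""
    counts = Counter()
    for line in lines:
        counts.update(line.lower().split())
    return counts

def reduce_counts(partials):
    """Reduce phase: merge the per-chunk counts into one in-memory result."""
    return reduce(lambda a, b: a + b, partials, Counter())

# Two chunks standing in for data spread across two nodes.
chunks = [
    ["big data meets hpc", "data flows in"],
    ["visualize the data", "hpc crunches numbers"],
]
totals = reduce_counts(map_chunk(c) for c in chunks)
print(totals["data"])  # "data" appears three times across both chunks
```

The appeal of the in-memory appliances is that both phases, and the working set between them, stay in RAM instead of spilling to disk, which is exactly the kind of memory-heavy, loosely coupled workload that sits awkwardly beside a traditional tightly coupled HPC cluster.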