Thursday, June 3, 2010
meet little big man
Meet Miwok - our latest foster. Miwok is 7 yrs old and was found abandoned in San Jose. He had once been someone's spoiled little prince - but by the time Hopalong Animal Rescue picked him up, he was filthy, dehydrated, very stinky, and had lost 1/3 of his body weight.
Miwok needed to be neutered and receive all of his vaccinations. He was microchipped and given a medical exam, Advantage, and deworming meds.
When I picked him up that night - he was still groggy from the surgery, but he has recovered fairly well, is eating better, and is very happy to be indoors.
Like all pugs - he is stubborn and clingy and we are working on some basic commands like sit and don't jump all over me darnit. : )
Miwok will be part of the Maddie's Fund Adoptathon event taking place next weekend - where hopefully he will meet his new forever family.
Sunday, May 23, 2010
<3 Maker Faire
Saturday, May 22, 2010
http://www.makerfaire.com/
I was able to convince a good friend to make the trip down to San Mateo fairgrounds, and will report back with all of the fun geekery seen and acquired.
Four things I learned from my HR Metrics mentor.
When you use numbers to demonstrate a point or explain a trend, be aware of how the metrics are used to tell a story, and make certain it is the story you want to tell. Leaders will manage what they can measure - be sure the metric is concretely tied to the change you wish to see.
The key to believable metrics is clearly defined qualifiers and parameters. A savvy leader is going to ask how you got your numbers, so present simple definitions for how you reached them.
Rarely is the person who researches and produces the metrics the same person who presents them to leadership. When you finish your metrics, place yourself in a leader's shoes and try to think of the 3-5 additional follow-up questions. Place the answers in your speaker's notes or presentation. The director or manager presenting the metrics will feel more comfortable and will project more confidence. Leaders will have a greater level of faith in the metrics if they feel they cannot "poke holes" in your arguments.
Wednesday, May 19, 2010
Next Steps become Big Steps
After our successful demo, my manager and I sat down in our conference room with whiteboard markers to draw out next steps. We began by talking about our expectations for the project, and very quickly saw that the sponsors' expectations of us had changed during the demo.
We went from proposing a solution to owning a program. The feedback from the National Learning Leaders showed they were so happy with our idea that we should "run with it." "It" being the whole process and program of 4D Follow Through - the development-driven interactions with leadership training participants over a 10-week period - taking place in 11 different regions, across 50 individual classes with 1,000 participants a year.
Wow.
Daunted but daring - we began to draw out what that would look like: what would need to be put into place in the form of policies, processes, training and resources.
First - we stepped through the known process. Where were the delta points? What would we need to change as we moved the process in-house with a new tool? Next, we began to lay out who would need to be doing these things. What roles would need to change? What new skill sets would they need?
We looked also at how we would handle change management. This new tool would require additional effort from the participants. We would be facing all of the misgivings of the old system - plus a fear of the new tool.
I created a project plan to begin capturing the next steps. The project plan will be presented back to our project sponsors, who hopefully will bless our vision for the future.
Tuesday, May 18, 2010
Business requirements - or - how I became an expert on three systems in three days
Well, not exactly an expert, but I certainly did learn a great deal very quickly.
As my manager gathered information on the system we would be replacing I was watching over her shoulder to get some advanced information on what systems we might be able to tap into.
KP has enough systems, you very rarely need to invent a new one. Chances are, someone, somewhere has bought or built a tool that would fit your need. We knew that we wanted to use an existing system, as we had no budget for licenses, setup or customization. We turned first to two HR systems - our LMS and our performance management system. We wanted to take advantage of the natural connections between training, development, coaching and performance management.
As the business requirements began to take shape, I started testing and examining our two HR systems for possible fits. I poked my head into the offices of system experts and cornered system sponsors in the hallways with hypothetical questions. Like most folks who begin business requirements work, I thought I already knew the solution to our issue. Naturally, I thought the LMS would be our best fit. I identified the "Learning Plan" functionality as the module with the closest match. I scoped out the work, tested some scenarios in our sandbox environment, and even considered the evil of customization - forcing the system to behave the way we needed it to.
The biggest issue lay in the fundamental mismatch in functionality. The LMS and performance management system were built for the partnership between the manager and the employee. We needed a system with two additional roles - Instructor and Coach.
What we needed was more of a community of interest - an open relationship between participants, managers, coaches and instructors. We needed web 2.0.
I had used the Jive/Clearspace KP-branded application "KP Ideabook" for some document sharing and to get a few answers on discussion boards. Just after we had discussed web 2.0 applications and what they were capable of, the idea was broached to try "KP Ideabook" against the business requirements. After two days of intensive learning, testing and building a demo system, I was able to score Ideabook against our other candidate systems. The results looked very favorable, and our small project team thought we had a hit. I mocked up a quick PowerPoint demo to present our findings and our recommendation to the project sponsors - the National Learning Leaders.
I was on PTO the day of the presentation, but received an email from my director upon my return:
"I wanted to let you know the presentation today was great. In fact, it was the only work from all the teams that we clearly have a go ahead to move forward on as outlined.
The deck was great and the slides of Ideabook screenshots were a real help.
Thanks to both of you for your excellent work. It is appreciated."
Awesome.
Monday, May 17, 2010
Charter 2.0
Because our leadership doesn't feel we have enough on our plates (what with the never-ending, ever-delayed project, and those 40 hour support jobs we had prior to the additional project work) they have decided they need a flashy cost saving project.
Several years ago, KP implemented a teaching methodology culled from The Six Disciplines of Breakthrough Learning: How to Turn Training and Development Into Business Results.
The 6D's - as they are called - can be explained in the handy image
The trickiest D's are the 4th and 5th - follow through and support. With our leadership training, particularly Facilitative Leadership, we instituted a 10-week goal setting/achievement process. The process has the training participant setting goals at the end of the class and working in partnership with his/her manager, coach and instructor to track and document the progress and success of those goals. The success measurements can be boiled down to training ROI - the holy grail of development metrics.
KP bought a tool for this process from a pricey vendor that claimed to automate and mine data from the process, creating nifty dashboards. Fort Hill was a niche company specializing in the 6D methodology, and its ResultsEngine was a hosted service with a low level of "hands on" support and maintenance of the program. The tool was complex and time-intensive to maintain, and the cost per user was not scalable. Moreover, the ResultsEngine, which we branded "Friday 5's," did not measure the right metrics (only the easy-to-capture transactional metrics) - but Leadership didn't mind; they wanted "numbers."
Eager to harvest some "low hanging fruit" dollars, leadership slated "Friday 5's" for replacement. My boss and I were given the task of proposing a no-cost system/low-cost resource solution. And we were given 2 weeks to do it.
Quickly, we sprang into action. My manager began polling existing users about the Friday 5's tool: what they liked and disliked, what was critical to keep, and what could slide without too much issue. I took the growing list and created a business requirements document.
What I found and proposed, and what happened to our project along the way, will be covered in later posts. This revised charter reflects some of the surprises and solutions, so I hope I do not spoil the surprise.
Friday, May 14, 2010
Project Documentation
Charter
Business Requirements
Project Plan
First Draft Screenshots
Thursday, May 6, 2010
momma's got a brand new bag
I am now in various states of implementation on 3 projects using "KP Ideabook": a community of interest group for the administrators of the application I support, coordination of a 6-week UAT (user acceptance testing) group of about 40 people, and a six disciplines of training follow-through tool for ongoing use throughout the company. The last implementation will be replacing an expensive vendor tool (Fort Hill ResultsEngine). I will be re-writing my charter on the six disciplines tool - as it is leader-sponsored and is going through the whole charter-to-implementation process in the next few weeks.
Wish me luck - charter to follow shortly.
Tuesday, May 4, 2010
crazy, crazy, crazy
I am integrating the tool as a "Community of Interest" for the administrators of my system, with interior subgroups for special teams - like Report Users and User Acceptance Testers. I will use these spaces as part of our overall deployment/communication effort. "KP Ideabook" has discussion boards, blogging, document management (including collaborative documents) and mini project plan modules. I am in a rush to get the space ready for our first push this coming Thursday, when I will demo the tool for our large user community. Our go-live date is drawing near again (Sept 1st), and it looks like this time it is for real. As one of only two people set to deploy not only a system but training on that system to 3,000 users - this will be a godsend.
As well - there is a leader sponsored project to replace a costly vendor tool with a no cost solution. Web 2.0 to the rescue. I am going to use "KP Ideabook" for this project as well - although I will set the functionality and user interface to look and behave very differently from my User Community of Interest.
This thing is going to be put in place right away - and had I not taken this class - I would have never understood the possibility of this tool. Thanks Jun!
Wednesday, April 14, 2010
I work for a really big company (made up of large sub-companies; ~180k employees), and the app I work on has to fit a really broad range of needs and be approved by a large set of clients. In some ways, it is really interesting to work at this level - a window on the training activity of the entire enterprise. My metrics are big, my reports have loads of users and I have a slew of "super user" clients.
In other ways - because it must meet so many requirements - the app is watered down, dulling some of the sharpest aspects of the software. There are whole suites of tools we bought that we cannot use, because they would not be scalable. In our enterprise situation, processes and decisions are all or nothing - a feature must suit everyone or it cannot be done.
I recently hit my ten-year mark at my company. I looked over my resume and saw that in the first 3 years - when I was working with a much smaller audience (6k) - I really got a lot more done than in the past 3. When I started, the department was new and processes had to be built everywhere - we were making up our structure as we went along. Now, in a more mature organization, processes are improved and tweaked; the possibility to implement a suite of apps like 37signals' has passed me by.
I feel envy for all of you who say your team is just getting started down this path - you have so many cool new tools to use and platforms to build that will take you to the next step. Enjoy it while you can.
Sunday, April 11, 2010
obsessed with ancestry.com
AC is a SaaS model where I pay a monthly fee to access the database records. The actual platform - the user interface and application - is secondary to the massive data that AC has digitized and made searchable. My mind boggles at the undertaking: digitizing hundreds of years of millions of records - records commonly found in county file cabinets, yellowing bits of census journals with pen-and-ink handscript. Optical character recognition (OCR) allows searches against images of the actual documents - I can view my grandparents' signatures on their marriage license.
My original goal of creating a basic family tree with a view towards getting a better idea of "where am I from" has basically been answered. (Long story short - the four corners of PA, VA, WV, OH before they were states.) The more time I spend on the tree, the more engrossing the whole process becomes. I built a base of ancestors back to 1700 (as far back as I care to go). As I searched and added family data from censuses and other sources, I came up against misspellings of names, duplicate records for a single ancestor, and changing city and county names (this came into play as the commonwealths became states and the Mason-Dixon line created WV from VA). As anyone who has created a db from scratch will tell you - data integrity is key (no pun intended!) and an obsession of its own. I kept an eye to what seemed the most accurate of records.
Of course, as soon as I had my data set, I wanted to play with it. I wanted a timeline chart, a chart by birth months, a map of residences; I wanted to see the whole tree in its splendor. As I mentioned above, AC as an application is pretty light, with very limited ways to view my data and no reports at all to showcase my records. I began thinking of AC as my system of record and searched for a reporting tool to feed my need for metrics. The common file type for family tree software is GEDCOM, which exports as a large text file with keyed records. I couldn't easily import the file into Excel or Access to create a home-brewed report, so I needed to find software that would integrate with and let me manipulate my newly-culled family data.
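For the curious: those "keyed records" are lines of the form "LEVEL [XREF] TAG [VALUE]", so a GEDCOM file can be flattened into spreadsheet rows with a short script. Here is a minimal sketch in Python - the sample data is made up, and a real Ancestry export has many more tags than the NAME/BIRT handled here.

```python
import csv
import io

# A tiny made-up sample in GEDCOM's line format: "LEVEL [XREF] TAG [VALUE]".
# Real exports are far larger but use the same structure.
SAMPLE = """\
0 @I1@ INDI
1 NAME Mary /Jones/
1 BIRT
2 DATE 14 MAR 1872
2 PLAC Washington County, PA
0 @I2@ INDI
1 NAME Thomas /Jones/
1 BIRT
2 DATE 3 JUL 1868
2 PLAC Ohio County, VA
"""

def parse_individuals(gedcom_text):
    """Collect one flat record per INDI entry: name, birth date, birth place."""
    people, person, context = [], None, None
    for line in gedcom_text.splitlines():
        parts = line.split(" ", 2)
        level = int(parts[0])
        if level == 0:
            if person:
                people.append(person)
            # Start a new record only for individual (INDI) entries.
            is_indi = len(parts) > 2 and parts[2] == "INDI"
            person = {"name": "", "birth_date": "", "birth_place": ""} if is_indi else None
            context = None
        elif person is not None:
            tag = parts[1]
            value = parts[2] if len(parts) > 2 else ""
            if level == 1:
                context = tag  # remember which event the level-2 lines belong to
                if tag == "NAME":
                    person["name"] = value.replace("/", "").strip()
            elif level == 2 and context == "BIRT":
                if tag == "DATE":
                    person["birth_date"] = value
                elif tag == "PLAC":
                    person["birth_place"] = value
    if person:
        people.append(person)
    return people

# Write the flattened records as CSV, ready for Excel or Access.
people = parse_individuals(SAMPLE)
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=["name", "birth_date", "birth_place"])
writer.writeheader()
writer.writerows(people)
print(out.getvalue())
```

The trick is the level numbers: a level-2 DATE line belongs to whatever level-1 event (BIRT, DEAT, MARR...) came before it, so the parser only needs to remember the current context.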
I have downloaded a demo of MacFamily Tree and look forward to testing out the touted reporting tools. Although I have been able to gather this hierarchical net of family data, what I am missing is the narrative – the family stories that truly populate a family tree. Luckily – it looks like I might be able to find some family stories via google books. Isn't technology grand?!
As a note to myself - future topics for posts:
CMM, PCMM, and HRO
http://en.wikipedia.org/wiki/Capability_Maturity_Model
http://en.wikipedia.org/wiki/People_Capability_Maturity_Model
http://en.wikipedia.org/wiki/High_reliability_organization
Favorite fruit/what feeds employee engagement
Audience size is everything, integration
Hosted/Internal - service and support
Thursday, April 8, 2010
Charter - attempt #1
I really struggled with this charter - I was not sure where the line was between a proposal and a solution. Here is my first attempt.
Charter - LMS Catalog Cleanup
Problem Statement
The number one complaint against the Learning Management System is that the course catalog is difficult to search. If learners are unable to locate training, they will either open a help desk ticket or neglect to enroll in and complete training. This complaint has escalated to the highest level of leadership. This issue is a major cause of mistrust/lack of buy-in of the LMS. (where can I find stats on time lost?)
Problem Details
The learning management system is enterprise wide and administered by regional training departments. The LMS is like a library in which any author can place his/her books on the shelves. The catalog search is made against course information, so clean and correctly tagged courses are key to a successful search. Administrators wish for their courses to be found - but are not always successful in understanding how to code the information correctly.
Customers
1. Learners
2. LMS Administrators
3. LMS Support Team
4. Learning Leadership
Business Case
(system agnostic)
(is this where I propose a solution?)
Create a series of course information audits (such as category, keyword, availability, course name, course ID, abstract, sponsoring dept, duration, domain, ce's, cancellation policy)
Trend existing course information to create a controlled vocabulary for keyword and category tags. Create a living taxonomy for course information.
Archive fallow courses.
Begin using existing description field for metatagging.
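The audits above could even be scripted. This is just an illustrative sketch - the field names, controlled vocabulary, and fallow threshold are all hypothetical stand-ins for whatever the real Saba course records and audit definitions would specify.

```python
from datetime import date

# Hypothetical controlled vocabulary and course records -- real field
# names and tags would come from the LMS and the audit definitions.
CONTROLLED_KEYWORDS = {"leadership", "compliance", "safety", "clinical"}

courses = [
    {"course_id": "C100", "title": "Facilitative Leadership",
     "keywords": ["leadership"], "abstract": "Ten-week leadership program.",
     "last_enrollment": date(2010, 4, 1)},
    {"course_id": "C200", "title": "Old Safety Course",
     "keywords": ["saftey"], "abstract": "",  # typo'd tag, no abstract
     "last_enrollment": date(2007, 6, 15)},
]

def audit_course(course, today=date(2010, 5, 1), fallow_years=2):
    """Return a list of audit findings for one course record."""
    findings = []
    if not course["abstract"].strip():
        findings.append("missing abstract")
    bad = [k for k in course["keywords"] if k not in CONTROLLED_KEYWORDS]
    if bad:
        findings.append("uncontrolled keywords: " + ", ".join(bad))
    age_years = (today - course["last_enrollment"]).days / 365
    if age_years >= fallow_years:
        findings.append("fallow: candidate for archive")
    return findings

for c in courses:
    issues = audit_course(c)
    print(c["course_id"], "-", "; ".join(issues) if issues else "clean")
```

Running an audit like this per field (category, availability, abstract, and so on) would turn "clean course information" from a wish into a reportable number.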
In Scope
Create a new set of best practices for creating courses. Change policies, processes, training and job aids around course creation. Create an audit process to ensure practices are followed. (Is this too close to a solution?)
Out of Scope
Addressing the search functionality of the LMS. Saba owns the search functionality of the LMS. We can only address data/record issues.
Success Metrics
Reduced help desk tickets
Improved perception of LMS
(clean course information audits?)
Deliverables
Communication Plan
Communication of new policies, job aids, training and best practices
Training Plan
Revised training for administrators and learners
Resources
Functional/Technical
Enterprise Learning Services
Sponsors
National Learning Leaders
Core Team
Thursday, April 1, 2010
Owners, sponsors and responsibilities
I personally think these roles and responsibilities will vary - depending on internal vs vendor systems, the size of the departments involved and the overall size of the system impact. A small business department working with a vendor-created and -implemented system will be quite different from a large business department working with a small internal IT department. If the HRIS is a small competency management system for a specialized workforce, the business might own a larger portion of the system than with, say, a compensation system for an entire company. The business side (which I am on) seems to have differing levels of sponsorship: an executive who funded the work and gets to announce the success, and department owner(s) whose team(s) are responsible for the planning, management, support and user acceptance.
Which brought up my question – to what level should a business sponsor understand how the system works? Beyond understanding the requirements they have and providing input on the user interface – what responsibilities does a business sponsor have to understand how the system performs and what it is made of? Is it fair to require an HR executive to have a working understanding of the HRIS?
Personally - I believe that business sponsors have a responsibility to understand the basic building blocks of the system. As public spokespersons for the system, project and results, sponsors should be able to answer a certain level of technical questions. As the "face" of the system to the rest of the company, when communicating the project they will undoubtedly be asked questions by other business partners. I do not think they have the luxury of seeing the system as a black box in which someone makes magic happen.
When I took my first HR essentials class, I was surprised to find how un-technical the HR world is perceived as being. My instructor did not know what a learning management system was (although she used one to deliver the online class), and the research I did for the HR scorecard paper all pointed to a disconnect between HR professionals and the data/metrics required to effectively manage a workforce. Is this your perception as well? As metric-driven as companies are today, I assumed that HR, as a strategic business partner, would be using tools to bring it out of administrivia and into a guiding partnership with the business. What do you think?
Monday, March 29, 2010
A bit about me.
I have worked in Human Resources for the past ten years at the huge not-for-profit company Kaiser Permanente. I started in 2000, working for the newly built IT arm of KP (~6k employees). We were just implementing PeopleSoft, but had no other major HRIS. After I had been working there a year, I built a little base .mdb to track hires and recruiter productivity, compiled from paper files covering 1 year's worth of hires. Since then, I have been interested in HRIS systems. Metrics appealed to me; I liked the way you could prove certain points and see trends with the hire data, although this new power did not make me popular with the recruiters. Before too long, we implemented an Icarian recruiting system and I was the regional IT admin. It automated a whole slew of functions, from job posting to hire letter generation, and allowed us to focus on recruiting quality hires in a tough market.
It was also in this position that I learned the maxim "what gets measured, gets managed." The length of the hiring/start process had been a major pain point with managers. We tracked a time-to-fill metric that began with the job posting and ended with an offer letter acceptance. It was not long before recruiters were gaming the system to get ever-shorter time-to-fill measurements that had nothing to do with the real process. As well, fast hires were tracked and quality hires were not, so fast hires ruled the priority of the recruiters.
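The metric itself is simple - which is exactly why its qualifiers matter so much. A minimal sketch, with made-up requisition data and the clock defined as posting-to-acceptance:

```python
from datetime import date
from statistics import median

# Hypothetical requisitions. The metric's definition is the important part:
# the clock starts at the job posting and stops at offer acceptance.
requisitions = [
    {"req_id": "R1", "posted": date(2002, 1, 7),  "accepted": date(2002, 2, 20)},
    {"req_id": "R2", "posted": date(2002, 1, 14), "accepted": date(2002, 3, 1)},
    {"req_id": "R3", "posted": date(2002, 2, 4),  "accepted": date(2002, 2, 25)},
]

def time_to_fill(req):
    """Days from posting to offer acceptance for one requisition."""
    return (req["accepted"] - req["posted"]).days

ttf = [time_to_fill(r) for r in requisitions]
print("per-req days:", ttf)
print("median days:", median(ttf))
```

Notice that nothing in the calculation captures hire quality - which is precisely the gap the recruiters exploited. Shift either endpoint (post later, push for a quick acceptance) and the number improves without the real process changing at all.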
After a while and many other projects (including one to handle RIF benefits and documentation), I moved from recruiting to workforce development and began to focus more on performance management, competencies and proficiencies. We did not yet have a PM tool, so I built - again - a small base .mdb to house the hierarchical, job family, compensation and performance data. This was very fun and rewarding, but it did require about 5,000 records being entered by hand from paper evaluations. It allowed us to do our first set of performance curves and calibration (ensuring one manager's 3 rating was the same as another manager's 3 rating).
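A calibration check like that one boils down to comparing each manager's rating distribution against the overall curve. This is only an illustrative sketch - the names, scores, and 0.5-point tolerance are invented, not the actual calibration rules we used:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical ratings on a 1-5 scale. Calibration asks whether one
# manager's "3" means the same thing as another manager's "3".
ratings = [
    ("Avery", 3), ("Avery", 4), ("Avery", 3),
    ("Blake", 5), ("Blake", 4), ("Blake", 5),
    ("Casey", 2), ("Casey", 3), ("Casey", 3),
]

by_manager = defaultdict(list)
for manager, score in ratings:
    by_manager[manager].append(score)

overall = mean(score for _, score in ratings)

def calibration_flags(by_manager, overall, tolerance=0.5):
    """Flag managers whose average rating drifts beyond the tolerance."""
    flags = {}
    for manager, scores in by_manager.items():
        drift = mean(scores) - overall
        if abs(drift) > tolerance:
            flags[manager] = round(drift, 2)
    return flags

print("overall mean:", round(overall, 2))
print("drift flags:", calibration_flags(by_manager, overall))
```

A positive drift suggests a lenient rater and a negative one a harsh rater; flagged managers become the starting point for a calibration conversation, not an automatic verdict.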
I began to work on workforce metrics, blending data from separate systems to create dashboards and scorecards. This project was the most interesting but also the trickiest: you have to challenge every assumption and trend you think you see to make certain you are drawing the right pictures from the right data. I also began working with an old LMS (Pathlore) to assist with the employee career development aspect of workforce development.
As everything was humming along, a large layoff occurred in our department, whittling 24 employees down to 2. The two of us left had to pick up every discarded role and responsibility, which led me to become the system admin for the LMS. In 2003, KP rolled out an enterprise-wide (180k+ learners) Saba-based LMS, and I migrated our data from the small, crippled LMS to a larger, stronger one. It automated a whole new level of tasks that previously had to be performed by hand. I still worked on workforce metrics - but now they focused on training ROI, cost per learner and behavioral changes. These metrics were the hardest to get to - for hundreds of reasons - and could not always be tied to actual training effectiveness.
A position opened up on the team that managed the Saba LMS at the enterprise level, and I moved over to become part trainer, part reports person, part system super user. We are in the middle of a long-delayed upgrade of both hardware and software. I have learned a great deal about roles and privileges and client-based reporting. I designed (but did not build) 20 new Crystal Reports for our new environment - whenever that arrives.
Anyhow – this is a long rambling introduction post. Hopefully it helps you to understand where my challenges are and what experiences I am pulling from.