Wednesday, February 26, 2014

The Recruiters

On 02/26/14 I received a bunch of calls from recruiters between 9 in the morning and 6 at night.  While two were for other jobs in the tech/mapping industry, eight of them were for a job that I had heard about for months.  It was variously described to me as an architect/analyst job, and they all provided me with the same description (see below).  It was for client job order/requisition 6020, an IT GIS Specialist/Analyst.  Most of them identified the client as PG&E at their location in downtown San Francisco.  The six-month contract was 40 hours a week, and the pay rate was more than $70 an hour on a W2.

Each of the recruiters had found my resume on CareerBuilder or Monster.  However, many of them had actually saved my resume from these websites within the last year.  Each one sent an email and made as many as two calls, and in every case they wanted a “right to represent” form from me in addition to my newest resume and a completed application.  They often wanted my availability and an explanation of any “gaps” in my resume.

The thing that has always struck me about these postings and recruiters is how uncoordinated they are.  They generally seem to be four- or five-person teams with a sheaf of announcements that they must fill.  I have noticed that often the same people call me for months at a time and then they are replaced by a new set of recruiters or recruitment groups.  While many of these agencies do seem to disappear over time, I have also found that there is a subset that persists.  These have a well-organized staff and require that each of their candidates go through an introductory interview before receiving new opportunities.  At this point it would seem that these groups are taking a greater stake in the projects for which they are recruiting and bear a closer resemblance to firms than to recruiters.  However, I am reluctant to include them in my lists of firms because I have only learned about them through the recruitment process.
  
These were the qualifications directly from the announcements:
The ideal candidate should have:
- Experience with ArcFM AutoUpdaters and ArcFM configuration
- Demonstrated ability to support business end users and production issues
- A good understanding of data management in a relational database
- 5-7 years of GIS Analyst experience utilizing the principles and practices of GIS (required)
- Knowledge of the multi-faceted disciplines which contribute to the implementation of the GIS application
- Some experience and exposure in GIS application development utilizing ASP.NET, C#, Microsoft Silverlight, and Adobe Flex
- Knowledge of Structured Query Language (SQL) scripting to retrieve data from relational database management systems in MS SQL Server
- Familiarity with ArcObjects/VBA/Python scripting and programming
- Considerable knowledge of and experience with the core ESRI GIS software products, specifically ArcGIS Desktop, Arc/Info, ArcSDE, and relational databases within MS SQL Server
- Knowledge of methods, procedure…
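Since the SQL requirement turns up in nearly every one of these postings, here is a minimal sketch of the sort of retrieval they mean, run from Python against SQL Server.  The server, database, table, and column names are all made up for illustration.

    import pyodbc

    # All names here are hypothetical -- the posting only specifies
    # "MS SQL Server," so the server, database, and table are invented.
    conn = pyodbc.connect(
        "DRIVER={SQL Server};"
        "SERVER=gisdb.example.com;DATABASE=GISData;Trusted_Connection=yes;"
    )
    cursor = conn.cursor()

    # Retrieve feature records edited in the last week -- the kind of
    # retrieval query the announcement describes.
    cursor.execute(
        """
        SELECT FeatureID, FeatureClass, LastEditDate
        FROM dbo.FeatureLog
        WHERE LastEditDate >= DATEADD(day, -7, GETDATE())
        ORDER BY LastEditDate DESC
        """
    )
    for feature_id, feature_class, edited in cursor.fetchall():
        print(feature_id, feature_class, edited)

    conn.close()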

These were the responsibilities from the announcements:
Staff members in this labor category provide database development support in creating cartographic and digital data products. These staff members have expertise that includes the performance of hard copy to digital data conversion tasks, data migration, and translation activities utilizing advanced processing techniques in ArcGIS. These individuals design, develop, and implement efficient production tools and workflows in accordance with approved project plans and design parameters.

Wednesday, February 5, 2014

Phone call with Marcus

I spoke to Ravi at Collabera and we confirmed my phone call with Marcus at 9:30.  When I called the number I ended up on a conference call with Marcus, Steve, Gordon, and Vikram.  Steve apparently was an associate that I remembered speaking with before.

I said that the team I had applied to work with did support for the development of GIS software.  Marcus asked some questions on behalf of the team, saying that the project I was applying for was phase two of the project and that they had a query tool that created tables, used GRASS, and identified buffered areas.  He called it an ETL tool that does data visualization, where if the data doesn’t pass inspection it may not be used.

I described some of my previous experience, and then Steve explained that Phase Two would provide further integration with existing software.  He called it regression testing to see if data was displayed correctly, and then described how the day-to-day activity was to query the database and verify that it made the correct outputs; if not, we would have to make the new data.  The team then explained that they were looking at which changes affected the front-end tool, saying that Gordon defines the requirements in liaison with the group and that everyone has to meet with him occasionally.  They said there were also general requests from other teams as well.
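As I understood the day-to-day, it amounts to a regression check along these lines.  The rows below are invented for illustration, but the shape of the check is what they described: pull the fresh query output and diff it against a known-good baseline.

    # A minimal sketch of the verification step as I understood it.
    # The baseline and query rows are invented for illustration.
    def verify_outputs(actual_rows, expected_rows):
        """Diff fresh query output against a known-good baseline."""
        missing = expected_rows - actual_rows
        unexpected = actual_rows - expected_rows
        return missing, unexpected

    baseline = {(1, "feeder"), (2, "transformer"), (3, "switch")}
    latest = {(1, "feeder"), (2, "transformer"), (4, "regulator")}

    missing, unexpected = verify_outputs(latest, baseline)
    if missing or unexpected:
        print("Regression check failed")
        print("  missing rows:", sorted(missing))
        print("  unexpected rows:", sorted(unexpected))
    else:
        print("Outputs match the baseline")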

I told Marcus I wasn’t certain what a Feature Manipulation Engine (FME) was.  However, once they explained that it was a buffer tool which turned routes into events, I told them that I had experience with tools like this.  They said it was an ETL tool and that it was crucial to interface with the FME developer to understand what manual changes were necessary.
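For what it’s worth, FME is Safe Software’s spatial ETL product, and the buffering step they described sounds like what the ESRI stack does with a standard buffer geoprocessing call.  This is only a sketch of that kind of step, not their actual tool; the geodatabase, layer names, and distance are made up.

    import arcpy

    # Hypothetical geodatabase and layer names -- a sketch of the kind
    # of buffering step the team described, not their actual tool.
    arcpy.env.workspace = r"C:\data\demo.gdb"

    # Buffer the route features out to 100 feet (distance is invented).
    arcpy.Buffer_analysis(
        in_features="routes",
        out_feature_class="routes_buffered_100ft",
        buffer_distance_or_field="100 Feet",
        dissolve_option="NONE",
    )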

Marcus indicated that I needed to know about shapefile and geodatabase exports, but Steve explained that a lot of that was peripheral.  Marcus also asked me what I had done in the GIS area.  Here I explained that my GIS background covered many years and that I was quite capable with the environment they had described.  I also indicated that the specific types of buffering and manipulation were like second nature to me, especially when it comes to tracking changes and testing.

I said that I am competent at data export/import and that once the tool is made, a lot of the work is already done.  I also noted that scripts need to be watched every time the process changes, especially if you make a new data source.  Steve then said we wouldn’t be building anything but that interfacing with the team was important.  He also said that I would be checking the source of the data and making certain that the front-end tool was correct.

Vikram then went on to say that the process was just as complicated as my last position with the company.  He then asked me to explain how my previous experience coincided with what they had explained.  I described the process of moving data from SharePoint to Access to the engineers and finally to the uploaders group.  I made certain to name all the principal actors as a reference as well.

I told them that the data I had been using didn’t use shapefiles and explained the lag between each team’s procedures.  I said that we used Access to query and then send data to the engineers and that I also knew how to upload data.  I described the process of loading and then sending the data to the testers and went on to explain our “Linear Match” tool for addressing buffering issues.

Steve went on to say that they use ModelBuilder, and I said that I had experience with it.  He then said that they used a model for testing which only needed my feedback.  I explained that a model is easy to understand if you have access to the source data to see what is going on with it.

I said that from the description I understood the job well.  I then confirmed the location and asked how many people were on the team; it sounded like there were around ten, with four developers and a couple of testers.

Within an hour Ravi called to tell me I had landed the job and that the HR guy would call me.  The HR guy called a half hour later to confirm that I had received the job and then asked if I would accept it.  He said that the contract was six months with the possibility of more.  He then confirmed my compensation and said he would send me the details.