The following is a geoscientist’s summary of the multi-discipline professional development sessions offered at the APEGGA Annual Conference in Edmonton in April 2004. (The 2005 Conference will be in Calgary). Over 220 people attended the six different 2-day streams of sessions – under the common theme “Making the Circle Stronger”. Many participants attended the APEGGA Summit Awards on the evening of the second day. John Boyd, P.Geoph., CSEG President in 1986, received the L.C. Charlesworth Professional Service Award; Dr. Elizabeth Cannon, P.Eng., an organizer of an afternoon Conference session, received the Frank Spragins Technical Award.
I take professional development seriously. My logbook is more-or-less up to date and whenever I get the chance to attend a CSPG, CSEG or APEGGA luncheon or brown bag function, I’m there. I confess I am also a course junkie.
Lately, my interest in course work has widened. I became a part-time trainer for the Petroleum Industry Training Service – which provides a new context to evaluate course and trainer performance. It was with this new-found viewpoint that I found myself northbound on Highway 2 from Calgary to that geotechnical engineering marvel known as the Shaw Conference Centre in Edmonton on April 21 to attend four PD sessions and the APEGGA Summit Awards Gala. Under the conference theme “Making the Circle Stronger”, my selection of session topics promised an overview of several growth technologies with connections to resource development.
Day 1 — morning session: “Exploration and Development of Bitumen” presented by Mike Ranger. Mike is well known in the industry for his enthusiasm and knowledge of the Athabasca deposits and provided a flood of information over the next four hours.
The presentation was divided into two parts: The Big Picture and The Details. Both portions were scaled to the non-geoscience crowd, using a minimum of jargon and well-selected photographs to illustrate key points. Your humble scribe was glad for this, as the last time I had any contact with oil sands geology was in the late 1980s. Much has been learned in the intervening 15 years!
Mike walked us through the key elements of oil sands deposit architecture: the Prairie Salt subcrop, the pre-Cretaceous unconformity and its topography, the stratigraphy of the Clearwater and McMurray formations, and the overlying drift geometry. He then showed how each element affects the quality of the bitumen reservoirs within the Athabasca basin.
Having established a clear frame of reference, Mike then wove in the details and specifics of the Wabasca deposit and one sub-basin of the Athabasca deposit, the Hangingstone.
The Wabasca basin is still relatively undisturbed, in equal parts due to its remote location and its smallish size compared to the Peace River and Athabasca deposits. The Hangingstone is another story. In 1986 there were 43 wells in the sub-basin. In 2003 there were 245. The 202 new holes were not for bitumen development purposes, however; they were drilled for gas production, and this information sparked a lively discussion about the recent treatment of gas producers in the oil sands area.
Day 1 — afternoon session: “Coalbed Methane and Water Resources”. The first of three presenters was Michael Gatens of MGV Energy, with the topic “Introduction and History of NGC/CBM Development” where NGC/CBM is Natural Gas from Coal/Coal Bed Methane. Michael made two distinctly different presentations: a discussion of the structure of the industry through its association, the Canadian Society for Unconventional Gas, and an outline of both the technology for coal-sourced methane extraction and the issues surrounding the technology.
To those outside the unconventional gas industry, Michael’s talk presented the issues in a remarkably non-partisan way that stimulated both discussion and fresh thinking about where this comparatively new source of energy is going. Particularly interesting was Michael’s discussion about why the environmental damage resulting from uncontrolled formation water release in the Powder River Basin in Montana would not happen in Alberta. First, Alberta’s regulatory environment is radically different from that in the Mid-Western United States and secondly, different basins have different properties and challenges. Getting that concept across to the public is clearly a tough assignment for the industry.
The second segment of the session featured Mary Griffiths of the Pembina Institute presenting “Stakeholder Issues and Water in Natural Gas in Coal/Coalbed Methane Development in Alberta”. Whereas Michael Gatens’ talk promoted the views of the industry, Mary Griffiths’ role was to counter with public policy and safety concerns.
She began by explaining the role and mandate of the Pembina Institute as a group of people interested in “holistic and practical solutions for a sustainable world”. The Institute sees itself as attempting to link sustainable development to “Technical and Management Solutions”, public policy, and education.
Mary’s presentation was based on a set of recommendations from a report prepared by the Institute in 2003 that suggested: (1) formal regulations should require the use of non-toxic substances for hydraulic fracturing in non-saline geologic zones, (2) prevention of negative environmental impacts from dewatering non-saline aquifers, (3) an optimum method of use or disposal of different grades of non-saline waters, (4) performance of a cumulative environmental impact assessment of large projects, and (5) strengthening the role of Alberta Environment in management of non-saline produced water.
Bill Gunter from the Alberta Research Council had the final paper of the day, “Potential Impacts of Future Technologies on CBM”.
The criteria for CBM utilization from a particular coal seam include permeability greater than 0.1 mD and adequate porosity to hold gas, water or a combination of the two. The enhanced extraction methods for recovery of methane involve withdrawing water from the coal or flushing water and methane with some other gas, such as nitrogen, carbon dioxide or power station flue gas. All these alternatives have been explored and all work well.
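As a rough illustration only, the screening criteria quoted in the talk can be expressed as a simple check. The permeability cut-off (0.1 mD) comes from the talk; the porosity floor below is an invented placeholder, since Bill gave no specific figure:

```python
# Illustrative CBM candidacy screen. The 0.1 mD permeability cut-off is
# the one quoted in the talk; the porosity cut-off is a made-up
# placeholder for demonstration, not a value from the session.

def cbm_candidate(permeability_md: float, porosity_frac: float,
                  min_perm_md: float = 0.1,
                  min_porosity: float = 0.01) -> bool:
    """Return True if a coal seam passes the simple screening criteria."""
    return permeability_md > min_perm_md and porosity_frac >= min_porosity

# A seam with 0.5 mD permeability and 2% porosity passes the screen;
# one at 0.05 mD does not.
print(cbm_candidate(0.5, 0.02))
print(cbm_candidate(0.05, 0.02))
```

This is a sketch of the stated cut-offs only; real seam evaluation obviously involves far more (gas content, saturation, depth, water handling).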
H2S stimulation is another enhanced CBM alternative Bill described in some detail. Forty-two projects are in operation that use unmineable coal seams for sequestering H2S from gas plants. CH4 is harvested and introduced into the plant output. Bill also talked about using bacterial decomposition of coal to produce CH4.
A panel discussion on NGC/CBM opportunities and problems concluded the day. At the industry level, it is clear that Alberta’s Horseshoe Canyon coals will be exploited first, followed by the deeper Mannville coals. The young Ardley coals will likely be developed last, as a series of technological breakthroughs are needed before economic CH4 extraction is feasible. It is also clear that commingling coalbed gas with interbedded sandstone gas production will be necessary to make total zone production feasible.
Probably the most important barrier to widespread acceptance of NGC/CBM resource exploitation will be the general mistrust felt by the public toward the oil and gas industry. Both Michael Gatens and Mary Griffiths agreed that the single common thread through all public meetings and hearings to date relates to past industry practices and to the Powder River Basin CBM operations. A major effort to gain public confidence in the Canadian industry is clearly essential.
Day 2 — morning session: Rick Brommeland from the National Institute of Nanotechnology in Edmonton moderated “Nanotechnology: Awareness”. The leadoff speaker for the morning session was Dr. Steven Dew, Department of Electrical and Computer Engineering at the University of Alberta. He drew the difficult task of setting the stage for the following speakers with a paper called simply “Nanotechnology”.
Steven began by defining nanotechnology. For the uninitiated (me), nanotechnology deals with engineering at dimensions on the order of 1 × 10⁻⁹ m. Steven’s definition, a quote from Albert Franks, was: “that area of science and technology where dimensions and tolerances in the range of 0.1 nm to 100 nm play a critical role.”
The origins of nanotechnology can be traced to Richard Feynman’s seminal 1959 paper “There’s plenty of room at the bottom” in which he called for research into reducing the size of machines and data storage devices. He got people thinking, although at that time there simply weren’t any tools available for reaching into the nano-regime.
Eric Drexler coined the term “nanotechnology” in 1986. Drexler foresaw the potential for machines capable of assembling devices at the molecular level. He also raised ethical concerns about self-replicating nanodevices getting out of control and evolving into an unpredictable form he referred to as “grey goo”.
Steven showed us a table of objects with their nanometer dimensional equivalents to get us thinking about relative sizes. For instance, an average person is 1.7 billion nm high; a hair is 75,000 nm in diameter; a virus is in the range of 75–100 nm; a transistor junction is 90 nm; DNA is 1–2 nm in diameter and a single atom is on the order of 0.1 nm in diameter.
Some of the tools available for this sort of work include self-assembly, electron beam lithography, scanning probe microscopy, which can manipulate individual atoms, and zeolite minerals that can trap or sieve single molecules of a defined size.
Research into nanostructured materials is progressing rapidly for such applications as nanofiltering, gas sensor substrates and templates for nanowires. One interesting development is a class of extraordinarily hard nanocrystalline materials.
Why would anyone want to build things at such an incredibly small scale? First of all, in an important sense, nanotechnology is nothing new. The petrochemical industry has been using catalysis, which is manipulating individual molecules on a megagram scale, for nearly 100 years.
Dr. Murray Gray from the Department of Chemical Engineering at the University of Alberta discussed current uses of nanotechnology within industry. Those applications include ion exchange columns for water treatment, adsorption techniques and crystalline molecular sieves synthesized from SiO2, and utilization of titanium and aluminum for low cost, elevated temperature gas separation purposes. Hydrotreating with nanoscale (5 nm) molybdenum islands on ceramic supports offers high molecular selectivity. Murray reminded us that bitumen upgrading is essentially a self-assembly operation, as the light ends that result from hydrotreating require no outside manipulation to achieve chemical stability. He also told us about a number of other self-assembly operations that will soon become commonplace, such as the production of nanotubes and “buckyballs” (buckminsterfullerene) from polyaromatic hydrocarbons for very strong, light engineering materials.
Next, Eric Vignola of Nova Chemicals treated us to an insight into “Polymer Nanocomposites Based on Smectite Clays: Preparation, Properties and Potential Applications”. Now, those of us from a geoscience background have a feeling for the native properties of smectites that does not include any notion of polymerization. We know active sites exist on clay platelet surfaces and usually think of them as sites for ion exchange activity. The idea of plugging an oppositely charged polymer thread end into those nanoscale active sites for purposes of creating a solid is pretty foreign.
Granted, those of us who have had experience with flocculants have used small amounts of polymers to initiate flocculation, but the idea of large scale binding to create solid materials for such diverse uses as auto parts, impact protection packaging, household furniture and appliances, gas barriers and fire-retarding building materials is simply astonishing.
The last speaker on nanotechnology was John McRory, the acting vice-president of research for TR Labs in Calgary. In “MEMS and Nanostructures,” John led us through a series of research efforts at TR Labs that were not strictly nanostructures, since the long dimension of some of the Micro-Electro-Mechanical Systems (MEMS) structures was in the 1 × 10⁻⁶ m range (microstructures). These projects helped the lab to develop expertise, which will ultimately achieve strictly nanostructure dimensions while producing useful products at the same time.
TR Labs is primarily concerned with telecommunications and specifically with the RF portion of the spectrum (<1 THz and >100 kHz). The main line of enquiry is directed to moving away from analog devices and into the realm of purely digital radio electronics. At the moment, RF and digital devices cannot coexist on the same substrate, as they interfere with one another. Much of the research in progress is working toward noise abatement and noise immunity in hybrid analog and digital devices.
Physical size of inductors is one significant limiting factor in reducing RF circuits from micro to nanoscale. Heat is another. John gave an example of how a heat problem is used to advantage: a variable capacitor, used for tuning an RF circuit, was developed from a bimetal strip with sub-millimetre dimensions; passing a dc current through the strip heats it, causing it to curl away from the lower-potential electrode and change the capacitance.
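Why curling away from the electrode tunes the circuit follows from the parallel-plate approximation C = ε₀A/d: capacitance falls as the gap widens. A minimal sketch, with plate area and gap values invented purely for illustration (John gave no device dimensions beyond “sub-millimetre”):

```python
# Parallel-plate capacitance: C = eps0 * A / d. As the heated bimetal
# strip curls away from the electrode, the gap d grows and C falls.
# All dimensions below are assumed, not from the talk.
EPS0 = 8.854e-12  # F/m, permittivity of free space

def capacitance(area_m2: float, gap_m: float) -> float:
    """Parallel-plate capacitance in farads (ignores fringing fields)."""
    return EPS0 * area_m2 / gap_m

area = (0.5e-3) ** 2          # a hypothetical 0.5 mm square plate
for gap_um in (1, 2, 5):      # gap widening as the strip curls away
    c = capacitance(area, gap_um * 1e-6)
    print(f"gap {gap_um} um -> C = {c * 1e15:.1f} fF")
```

The sketch shows only the tuning principle; a real MEMS design must also account for fringing fields and the strip's thermal time constant.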
John expects true nanoscale RF devices to appear over the next decade in response to the telecommunications industry need for smaller, lower power and higher efficiency devices.
Day 2 — afternoon session: “Geomatics in Alberta: Present and Future” involved four papers from the Department of Geomatics Engineering at the University of Calgary. Geomatics Engineering is rooted in the art of surveying engineering, wherein much of the early research was conducted. In fact, the Geomatics Department arose from the Department of Surveying Engineering within Civil Engineering at the University of Calgary in the early 1980s under Dr. Ed Krakiwsky. It deals with acquisition, modeling, analysis, representation and management of geospatial data.
Dr. Gerard Lachapelle’s paper on “Satellite-based Positioning and Wireless Location” provided basic information on Global Positioning System (GPS) technology. While the use of GPS instruments has exploded in recent years, the basic concepts behind the numbers on the receiver read-out, the sources of error and limits of performance are poorly understood. Gerard described in simple terms how to interpret readings and what to expect in terms of accuracy from the three main classes of instruments available today. He then described the differential GPS and Real Time Kinematics (RTK) sub-metre positioning technologies, also in simple terms. Most of the audience was unaware these technologies existed, and it was a revelation to me.
Gerard concluded his talk with the information that GPS version 2 (GPS2) will begin installation this year and will take seven years to complete. A new system named Galileo will be ready in 3 to 5 years, with 16 to 18 satellites visible at any given time, which will enhance system reliability considerably.
Dr. Nasser El-Sheimy introduced participants to “Mobile Mapping Systems”, which have been the focus for R&D at the Department for a number of years. Examples of applications already in the field for this application of RTK technology include municipal infrastructure mapping using digital video data acquisition; real-time identification and location of forest fire hotspots using Differential GPS (DGPS); LIDAR (Light Detection and Ranging) airborne surveys; and pipeline mapping using the GeoPig System. Current research includes work on integration of a low-cost attitude sensor in mobile systems to further improve mapping accuracy.
The final paper in the series was presented by Cathy Valeo. The paper was entitled “Environmental Applications of Geomatics Engineering” and it dealt with using remote sensing techniques integrated with Geographic Information Systems (GIS) technology to monitor environmental conditions in near-real time.
Multispectral imaging from satellite platforms is readily available from internet sources, some of which are free. Cathy gave us a set of URLs where imagery is available. She then described how GIS (which can also stand for “Geospatial Information Systems”) can be populated with both imagery and ground truth data to produce accurate thematic maps for environmental monitoring purposes. She also made the point that without the sophisticated data management capabilities of a modern GIS, many data management tasks would be impractical, such as data conversion, parameter extraction and interactive mapping.
Another application of GIS is providing data for numerical modeling purposes. Environmental models can be as simple as a set of parametric statistics displayed using standard charts from a spreadsheet, or as complex as single-purpose, custom-programmed predictive models that are part of a fully integrated package. Cathy walked us through a soil erosion problem at three levels of sophistication: a simple “bridged” parametric statistical model; a more complex “embedded” model that subdivided a watershed into its component parts using subprograms written in Visual Basic for Applications; and a sophisticated “assimilated” approach to three-dimensional watershed simulation modeling using PCRaster, a specialized modeling language in the public domain and available for free download from the internet.
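To give a flavour of the simple parametric end of that spectrum, here is a sketch using the classic Universal Soil Loss Equation, A = R·K·LS·C·P. This is my own illustration, not the model Cathy presented, and every factor value below is invented:

```python
# A minimal parametric soil-loss model in the spirit of the "bridged"
# example: the Universal Soil Loss Equation, A = R * K * LS * C * P.
# All factor values here are invented for illustration only.

def usle(R: float, K: float, LS: float, C: float, P: float) -> float:
    """Annual soil loss A from the five USLE factors:
    rainfall erosivity R, soil erodibility K, slope length-steepness LS,
    cover-management C, and support-practice P."""
    return R * K * LS * C * P

# A hypothetical watershed cell:
A = usle(R=500, K=0.3, LS=1.2, C=0.2, P=1.0)
print(f"predicted soil loss: {A:.1f} tonnes/ha/yr")
```

In a real GIS workflow each factor would be a raster layer derived from imagery and ground truth, and the multiplication would run cell by cell across the watershed.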
The four Geomatics talks were extremely informative. By chance, I am in the early stages of developing a multidimensional field program that will require using GPS and GIS technology. The insights I gained from the workshop will make my coordination task much easier as I now have some background information, and an important set of contacts to smooth the way.
In summary, I found the workshops and seminars offered for Professional Development at the APEGGA Annual Conference to be extremely interesting and helpful. The contacts made with the presenters and with other participants during the coffee breaks, lunch breaks and the wrap-up Gala will be very useful in both a business and professional sense during the coming year. In the normal course of business, there are some disciplines that an independent professional would never encounter. Cross-fertilization of ideas from other branches of earth sciences and engineering is partly what APEGGA Professional Development functions are all about.
About the Author(s)
Tom Sneddon has been a member of CSPG for many years and volunteers at many APEGGA functions. He started his professional career in geophysics with AMOCO in 1966, where he joined the CSEG, but was led astray into geology in the 1980s, where he has been ever since. Tom is also a part-time trainer for the Petroleum Industry Training Service.