One of the "problems" we have right now with this new viewer is that although we have got most of the foundational stuff right, the UI lacks stylistic cohesion. This is because the viewer is a mish-mash of various React components, each with its own unique styling quirks. I really wanted a React-based UI toolkit with a good baseline set of widgets/components and a unified look for building desktop-centric web applications (with an option to go mobile down the road).
Well, I think my wish for such a toolkit has now been fulfilled, and its name is Blueprint.
What sold me on this particular UI toolkit was:
- It is React-based
- It has a wide range of components covering most of what I need to port across the remaining Fusion templates
- It comes with a diverse set of icon fonts
- It has a friendly license (BSD)
- It is written in TypeScript and the library comes bundled with TypeScript definitions
- It looks good!
We have better looking modal dialogs (as seen in our updated Aqua template)
We have better styled UI for certain tools
And we have key components needed to start bringing across the other remaining Fusion templates, like the TurquoiseYellow template.
Compare the original Fusion template
With our blueprint-powered version
Looks good enough, doesn't it?
Now there is one executive decision I'm making with the Fusion templates I'm porting over: the overview map will always be present as a toggleable button on the main map viewer component, not outside of the map viewport.
Part of the problem is that having the OpenLayers OverviewMap control render its content outside of the map viewport doesn't play nicely with React component updates, at least in my attempts thus far, so I've made the creative decision not to chase a 1:1 match when porting these templates over. As long as the main elements and styles are there, it's good enough for me.
Now sadly, despite having made several existing libraries and React components redundant (they have since been removed), taking on Blueprint has added a lot of extra weight to our final bundle.
Fortunately, this is still significantly under the current Fusion production bundle size, and as we are still working toward functional parity with our existing AJAX/Fusion viewers, bundle optimization is not a priority at the moment. When that time comes, we can look at things like custom OpenLayers build profiles and moving to webpack 2 for its tree-shaking feature, which should make some inroads toward cutting our production bundle size down to more acceptable levels.
gvSIG Team: gvSIG Batoví, article about GIS applied to educational environments based on gvSIG Desktop
Directions Magazine, an online publication specializing in Geographical Information Technologies, has published an article highlighting gvSIG Batoví as an example of cooperative planning and creative partnership.
Last week Andrea Antonello (HydroloGIS) shared some very interesting documentation for a course on geographic scripting (Python) with gvSIG Desktop. The course was given as part of a Master's program at the University of Potsdam.
The documentation includes:
- The scripting composer and Python
- Geographic scripting
- Raster Data
- From Geo into your report
You can find the documentation linked in the original post:
I have found it extremely difficult to extract information about NRPP from the government. FOIs have come with very large fee assessments, or documents have been completely redacted – you’d think I was out to get them or something.
News flash: I am not out to get them. In fact, they are the best-run IT mega-project I’ve seen so far in the BC government. But that doesn’t change the fact that, like the dinosaurs before them, they are doomed, dooooomed.
Fortunately, the FOI process can be indiscriminate, information can leak out despite the best efforts of the project team, and last month there was a great tidbit about NRPP.
Hiding inside an FOI request to the Environmental Assessment Office (EAO) were a number of interesting documents.

Who’s On First?
define the meaning of ‘sector transformation’ by documenting a model that will enable its implementation.
Or put another way, after three years and $50M, the “Natural Resources Sector Transformation Secretariat” (NRSTS) still isn’t clear on what “transformation” means, and is looking for guidance.
It’s not too hard to figure out what’s going on.
Lacking any operational mandate themselves, NRSTS has been going around looking for partners with operational permitting processes: “come work with us, we’ll re-structure your business work-flow and make it ‘better’ with cool software we haven’t built yet”.
To me, this sounds good (cool software! better!); to NRSTS, this sounds good (re-structure!); to the permitters it sounds like “we’re going to waste a bunch of your (scarce) time in workshops making you talk about abstractions instead of doing your job, then we’re going to upend your office in the service of ‘transformation’, while we experiment with software that is as yet unwritten”.
You know what happened to the guy who had the first human heart transplant? In IT terms, the procedure was a success – he didn’t die on the table, he died 18 days later of pneumonia, with his heart still pumping away.

Back to High School
After asking the ADMs for advice on ‘transformation’, Bangert then tells them he needs a “subject matter expert” (at a “decision making level”!) from each of them to attend “several” workshops in the summer and a week-long workshop in the fall. So, presumably a manager or director, whose time is sufficiently low value that it can be donated to NRSTS for days at a time.
But that’s not even the best part. The best part is the structure of that five-day fall workshop. NRPP is going to be running (has already run?) a “Model UN” process with all these managers and directors.
My favourite bullet points! From the “How Will it Work” section:
- An unlimited number of delegates are allowed per Ministry. Attendance and pre-work completion is mandatory before and for the duration of the workshop.
Because more is better, and mandatory homework makes fast friends! From “Who Should Attend”:
- Folks who are highly motivated to make the Natural Resources Sector “processes” work better.
- Extroverted communicators and people connectors.
- Introverted thought leaders and thinkers.
- Creative problem solvers.
Great combination! Anyone not invited?
Still, so far we’re just talking about a standard “consultant-facilitated workshop time vortex”, of a sort we’ve all participated in and/or inflicted on others. The bit that is really transcendent is the “engagement model UN process”:
- There is a general assembly component
- Only voting delegates attend
- Voting on resolutions prepared by committees
- Decision making body for the process
- Mandatory that voting delegate attends
- Fixed time for debate and voting
- A chairperson oversees
- Process repeats [emphasis added] until all aspects of the work flow have been reviewed, resolution prepared, and voted on.
How could this possibly go wrong?!?

The Smell of Desperation
Once again, inputs:
- Three years,
- $50,000,000 and counting, and
- A whole Natural Resources Sector Transformation Secretariat.
And the outputs:
- “What exactly do you mean by ‘transformation’, really?”
- “Lend us your SME’s for a week, so we can figure out a generic process to stuff your business into.”
- Also, some unimpressive deliverables.
If a core early problem with NRSTS was that nobody wanted to be the first organization to be subjected to their tender mercies, imagine how they are perceived now, as they come up on the end of their Phase 1 funding and still haven’t even figured out what “transformation” means?
Would you trust your staff time and business process to an organization that looks likely to be blown up in the next 24 months? If so, why?
Addendum: Commenters, please weigh in on whether the recent departure of the Executive Director, Technology to work with major project consultant CGI is (a) a sign of good things to come (CGI positioning to win follow-on work in Phase Two) or (b) a sign of imminent disaster (man-in-the-know getting out while the getting is good).
Here is the text I used in my speech at the opening of the 12th International gvSIG Conference (you can follow all the activity on Twitter with the hashtag #12gvsig).

Good morning to everyone present. I would like to begin by thanking the effort of all the people who have brought us here, to our 12th International gvSIG Conference. Twelve years. That is easy to say for a free software project born on the periphery of Europe that today is known throughout the world: just one fact, the latest version of gvSIG Desktop has been downloaded in more than 160 countries. And I want to begin by especially thanking the work of the gvSIG Association team. I am tremendously fortunate to be able to work with people who give so much to a common project, like Joaquín, Mario, Óscar or Manuel. Three of them, moreover, trained at this school, as did I. For that reason I believe this is a special conference for us: we are returning to our home, which perhaps we never left.

In these times when it seems everything must be merchandise, when everything is speculated on, even knowledge… talking about free geomatics, about a solidary and collaborative project, at a public university makes complete sense. The public sphere is bound up with access to knowledge. If knowledge is not accessible to everyone, we will be creating more unjust societies. The same is true if technology is not a common good.

For that reason, in this setting, I encourage the attending professors to promote free software in the classroom, and the students to demand it.

I remember that a few years ago, at another gvSIG conference, a professor (not from this school) asked, “And what do I gain by teaching my classes with free software rather than proprietary software?”, going on to stress what in my view is a fallacy: “It is what the market demands” – all the more so seeing the evolution and adoption of free geomatics since then. But I ask myself: what is demanded? What does our society demand today? That students know how to use a particular brand of software from a particular company, often foreign to the country? Let us not fall into that trap. No, what society demands is that future engineers, architects and graduates build a better society: more democratic, more egalitarian, more collaborative, with efforts joined to that end.

We are at a gvSIG conference with the motto “Know the territory, Manage reality”. Geomatics – the geopositioning of information and its management – has become a fundamental piece. We are realizing that without knowing the spatial relationships of the information around us, we cannot manage efficiently. That modernizing management undoubtedly requires integrating geomatics with the rest of our information systems.

And that this modernization requires betting on technologies of universal use, with the freedom to adapt and improve them, with no restrictions other than preserving our freedom. That “free” or “proprietary” is an adjective that refers only to the conditions under which software is exploited, not to its quality. That proprietary software perpetuates dependence on private entities, while free software can become an economic engine driving a highly skilled industrial fabric. And that the responsibility for making it so falls on companies as much as on our public administrations and universities.

I want to finish with a quote from Antonio Machado that sums up very well what we are talking about: “In matters of culture and knowledge, only what is saved is lost; only what is given is gained.” Thank you very much, and welcome to the 12th International gvSIG Conference.
Do you need to receive gigabytes of data from your clients or friends? We do. People send us large maps and geodata, often tens or even hundreds of gigabytes. Now they can do that directly from a web browser, by drag & drop!
To make this possible, we’ve built a pretty cool web application named Drive Uploader, which works with Google Drive™ cloud storage.
Check it out at https://driveuploader.com
The uploader you create looks very familiar and you can customize the design, if you need to.
We’ve built the Drive Uploader to easily receive large files (documents, pictures, video clips, aerial photos, etc.) from customers, suppliers, colleagues, family or friends.
The sender does not need to create any account, they just visit a website and use their web browser.
The main advantages of the service are:
- No limit on size of the files
- No need to have a Google account for the senders
- Extremely easy to use and integrate with websites
- Security of the transfer (Drive Uploader uses HTTPS and private storage)
- Files are saved directly in your Google Drive™ in the folder you choose
Embedding in websites, branding with your logo, and integration into your products are possible via API and webhooks. These advanced features are available in a paid plan, together with removal of ads.
Every Gmail account comes with 15 GB of free Drive storage and affordable upgrade plans. Business and education G Suite / Google Apps accounts come with unlimited cloud storage. It is time to use this storage!
So, once more - the link to try is https://driveuploader.com
Last week I gave a one-week course on geographic scripting with gvSIG as part of a Master's course at the University of Potsdam.
As usual, I want to share the material with those interested in looking at the course contents from home.
Leaving out the first introduction parts that do not involve any code, here are the good parts of the course:
The 3rd part introduces gvSIG's scripting composer and a bit of the language used: Python.
PART 3: THE SCRIPTING COMPOSER AND PYTHON
The 4th part dives into the building of geometries and the reading and writing of shapefiles.
PART 4: GEOGRAPHIC SCRIPTING
The 5th part gives an insight about raster data handling for scientific purposes.
PART 5: RASTER DATA
Part 6 has a few extras, like doing some charting and writing LibreOffice spreadsheets.
PART 6: FROM GEO INTO YOUR REPORT
I hope you will enjoy this course, it is a very practical one and therefore packed with tons of working code snippets.
Obviously, if you want me to come to your university to give the course, just get in touch with me. :-)
The UPV has confirmed to us that part of the gvSIG Conference will be streamed live: specifically, all the activities held in the “Salón de Actos” (where the talks from Session 4 onward will take place), room “Aula 0.1” and the “Ptolomeo” room, so a good number of talks and some workshops can be followed online.
The links are:
Salón de Actos:
You can check the program of activities for each of the venues here:
The traditional way to use data from a database is to configure a table or a view as a new layer in GeoServer. Starting with GeoServer 2.1.0, users can create a new layer by specifying a SQL query, without having to actually create a view in the database.
A parameterized SQL View is based on a SQL query containing parameters whose values can be supplied dynamically along with WMS or WFS requests. A parameter is delimited by % signs, may have a default value, and should always have a validation regular expression.
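To make the mechanism concrete, here is a small illustrative sketch (our own Python, not GeoServer's actual implementation) of how such a parameterized query works: each %NAME% placeholder is replaced with the supplied or default value, but only after the value passes that parameter's validation regex, which is what keeps request-supplied values from injecting arbitrary SQL.

```python
import re

# Illustrative only: expand %NAME% placeholders in a SQL View query,
# validating each supplied value against its regular expression first.
def expand_sql_view(sql, params, defaults, validators):
    result = sql
    for name, default in defaults.items():
        value = str(params.get(name, default))
        # Reject any value that does not fully match the parameter's regex.
        if not re.fullmatch(validators[name], value):
            raise ValueError(f"invalid value for {name}: {value!r}")
        result = result.replace(f"%{name}%", value)
    return result

sql = ("SELECT * FROM storm_obs "
       "WHERE obs_year BETWEEN %MIN_OBS_YEAR% AND %MAX_OBS_YEAR%")
defaults = {"MIN_OBS_YEAR": "0", "MAX_OBS_YEAR": "2020"}
validators = {"MIN_OBS_YEAR": r"\d{1,4}", "MAX_OBS_YEAR": r"\d{1,4}"}

# Supplying only MIN_OBS_YEAR; MAX_OBS_YEAR falls back to its default.
print(expand_sql_view(sql, {"MIN_OBS_YEAR": "2000"}, defaults, validators))
```

A value like `1; DROP TABLE storm_obs` would fail the `\d{1,4}` check and be rejected, which is why GeoServer insists on a validation expression for every parameter.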
To understand this better, let's create a layer from a parameterized SQL View:
1. To create a parameterized SQL View, follow steps 1 and 2 of the previous post, then enter the following query:

SELECT date_part('year'::text, t1.obs_datetime) AS obs_year,
       t1.storm_num, t1.storm_name,
       t1.wind, t2.wind AS wind_end,
       t1.press, t2.press AS press_end,
       t1.obs_datetime, t2.obs_datetime AS obs_datetime_end,
       st_makeline(t1.geom, t2.geom) AS geom
FROM storm_obs t1
JOIN (SELECT storm_obs.id, storm_obs.storm_num, storm_obs.storm_name,
             storm_obs.wind, storm_obs.press, storm_obs.obs_datetime,
             storm_obs.geom
      FROM storm_obs) t2
  ON (t1.obs_datetime + '06:00:00'::interval) = t2.obs_datetime
 AND t1.storm_name::text = t2.storm_name::text
WHERE date_part('year'::text, t1.obs_datetime) BETWEEN %MIN_OBS_YEAR% AND %MAX_OBS_YEAR%
ORDER BY date_part('year'::text, t1.obs_datetime), t1.storm_num, t1.obs_datetime
2. Click to extract the parameters from the SQL. GeoServer will automatically create fields for the parameters specified in the view:
3. Fill in some default values for the parameters so that GeoServer can execute the query and inspect the results in the next steps. Set 2020 for MAX_OBS_YEAR and 0 for MIN_OBS_YEAR.
4. Refresh the attributes, check the SRID, and publish the layer. In the configuration, assign the storm_track_interval style to the layer as its default style.
5. Click OpenLayers on the layer preview screen for the v_storm_track_interval layer.
6. At first you won't see anything, since the layer is using the default parameters for the observation years. Specify two years for display by adding this parameter to the end of the GetMap request: &viewparams=MIN_OBS_YEAR:2000;MAX_OBS_YEAR:2000
You should get a request like this: http://localhost:8083/geoserver/geosolutions/wms?service=WMS&version=1.1.0&request=GetMap&layers=geosolutions:v_storm_track_interval&styles=&bbox=-180.0,-90.0,180.0,90.0&width=660&height=330&srs=EPSG:4326&format=application/openlayers&viewparams=MIN_OBS_YEAR:2000;MAX_OBS_YEAR:2000
7. Now you can see the hurricanes and also dynamically choose the range of observation years you are interested in.
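If you are scripting against the service, the GetMap request from step 6 above can be assembled programmatically. This is a small helper of our own (not part of GeoServer), sketched in Python, showing how viewparams entries are joined as NAME:value pairs separated by semicolons:

```python
from urllib.parse import urlencode

# Build a WMS GetMap URL, appending a viewparams string for a
# parameterized SQL View layer (our own helper, for illustration).
def getmap_url(base, layer, view_params, bbox="-180.0,-90.0,180.0,90.0"):
    params = {
        "service": "WMS",
        "version": "1.1.0",
        "request": "GetMap",
        "layers": layer,
        "styles": "",
        "bbox": bbox,
        "width": 660,
        "height": 330,
        "srs": "EPSG:4326",
        "format": "application/openlayers",
        # viewparams entries are NAME:value pairs separated by semicolons.
        "viewparams": ";".join(f"{k}:{v}" for k, v in view_params.items()),
    }
    return f"{base}?{urlencode(params)}"

url = getmap_url(
    "http://localhost:8083/geoserver/geosolutions/wms",
    "geosolutions:v_storm_track_interval",
    {"MIN_OBS_YEAR": 2000, "MAX_OBS_YEAR": 2000},
)
print(url)
```

Note that `urlencode` percent-encodes the `:` and `;` characters in the viewparams value; GeoServer accepts them either encoded or literal.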
Source: GeoSolutions
Thanks to everyone who took part in the code-freeze, monthly bug stomp, or directly making the release. This release is made in conjunction with GeoServer 2.10.0.
This release of GeoTools has been through a beta and release candidate test phase, so everything you need should be working fine (or did you forget to test those releases?):
- The library now defaults to using PreventLocalEntityResolver with XML parsers for improved security. For more details (and how to disable this behavior) please see the GeoTools user guide.
- The gt-wfs-ng client is taking over; please try it out with your Web Feature Services.
- CSS now supports rendering transformations.
- [GEOT-5544] - GridCoverageFactory throws NullPointerException on arg described as nullable
- [GEOT-5548] - Filter function "greaterEqualThan" is not properly working
- [GEOT-5549] - Regression: insert operations produce SQL that includes identity column; SQL Server plugin then fails to insert
- [GEOT-5550] - SQLServer date/time based filtering fails if the server is not in english locale
- [GEOT-5552] - XSD schema manipulation not fully synchronized causes concurrency issues
- [GEOT-5553] - SQLServerLobOnlineTest#testCreateSchema fails when using the Microsoft JDBC driver
- [GEOT-5554] - SQLServerJNDIDataSourceOnlineTest fails if the Microsoft driver is not in the classpath
- [GEOT-5542] - Update GT Pom to depend on ImageIO-EXT 1.1.16
- [GEOT-5473] - Add support for "excess granule removal" in image mosaic
- [GEOT-5526] - SQL Encoding of Filters on Nested Attributes
- The wfs-ng module is now a drop-in replacement for, and will be replacing, gt-wfs
- The NetCDF module now uses NetCDF-Java 4.6.6
This release is available from the Maven repository and is made in conjunction with GeoServer 2.9.3.
GeoTools 15.3 is a maintenance release focused on bug fixes. While this release is suitable for production systems we recommend planning your upgrade to GeoTools 16.
Features and Improvements:
- Upgrade to use of latest ImageIO-EXT 1.1.16 for raster formats
- SQL encoding of filters on nested attributes
- Rendering can now delegate band selection to coverage reader
- PostgreSQL 9.6 index sorting fixed
- Several SQL Server fixes, including bulk insert, date/time filtering, and compatibility with the Microsoft JDBC driver
- SLD fix for rescaling graphic with anchor point
- Grid coverage
About GeoTools 15
What's new in GeoTools 15:
From September 14 to 16, the 5th Brazilian gvSIG Conference (5ª Jornadas Brasileiras de gvSIG) took place in the city of Santa Maria/RS. At the invitation of the event organizers, I gave a presentation (by videoconference) talking a bit about OSGeo and how gvSIG fits into that organization.
Finally, I announced FOSS4G Brasil 2017, an event that will take place in July in the city of Curitiba/PR.
Below is the link to the presentation, for anyone interested in the subject:
As befits a patch release, the focus is on bugs and breakages.
The following packages can now be installed:
- qgis 2.18.0
- qgis-debuginfo 2.18.0
- qgis-devel 2.18.0
- qgis-grass 2.18.0
- qgis-python 2.18.0
- qgis-server 2.18.0
Installation instructions (run as “root” user or use “sudo”):

su
# Fedora 23, Fedora 24:
dnf copr enable neteler/QGIS-2.18-Las-Palmas
dnf update
# note: the "qca-ossl" package is the OpenSSL plugin for QCA
dnf install qgis qgis-grass qgis-python qca-ossl
Installation instructions:

su
# Fedora 23+24:
# install this extra repo
dnf copr enable neteler/GDAL
# A) in case of update, simply
dnf update
# B) in case of new installation (gdal-devel is optional)
dnf install gdal gdal-python gdal-devel
Long live NRPP!
The Natural Resource Permitting Project (NRPP) is now mired down, having failed to deliver on its ambitious promises to transform the sector with “generic frameworks that will support the ‘One Project, One Process’ model”.
But, as my ‘ole grand-pappy used to say to me: “When the going gets tough, the tough redefine success so they can still declare victory.”
Accordingly, success re-definition is under way at NRPP. Success will no longer be a generational transformation in how government manages natural resources; success will now be submitting formerly paper forms using web forms.
But wait, I said NRPP is “mired down”, how can I tell? By measuring the outputs against the inputs.

Lots of Money Going In
NRPP has been ongoing in various forms and names since before 2013, and for at least the last two years has been carrying a staff/consultant complement that I’d estimate costs about $17M per year. I’ve heard estimates of expenditures to date of over $50M, and that is consistent with my back-of-the-envelope calculations.
So, $50M or more in. What’s come out? (Worth remembering, successful $1B start-up companies have been built for less.)

Not So Much Coming Out
In March of 2016, the Executive Director of NRPP gave a progress update to the Deputy Ministers Committee on Transformation and Technology (DMCTT). Good news: “year 2 of the initiative has been delivered on time, on scope and on budget”.
- Clients can now access NRS online services for guidance, information and map-based data to support applications for authorizations
- 290 data layers are now accessible through NRS Online Services
- Hunters will be able to register online for the Limited Entry Hunt in mid-April 2016
- Legislation will be introduced in Spring 2016 to move selected Fish and Wildlife authorizations to a criteria based notification model
All of these assertions are superficially true, but even from my perch far outside the warm light of the inner circles of government, it’s laughably easy to find substantial caveats and concerns about all four of them.
I really wonder what the purpose is of reporting to high-level “oversight” committees like DMCTT, if the committees just accept the reports and do not bother to do any independent verification and research.
If you have only the information and spin from the project in front of you, no matter how piercing and direct your analysis is, you’re never going to really be able to ask the tough questions, because the key information will be hidden or obfuscated.
This is why so much “oversight” seems to devolve into reductive discussions of schedule and budget, the only metrics that all participants are guaranteed to understand and that all projects are required to provide.
Feel free to deliver a product that fails to meet your user needs – the big boss will never notice. But slip your schedule by 2 weeks, and the fiery wrath of God will descend upon you. Project management and communication is optimized accordingly.
I want to look closely at each of the pieces of good news about “year 2”.

Clients can now access NRS online services for guidance, information and map-based data to support applications for authorizations
Back? How did it go?
I’ll wager you didn’t find the actual Natural Resource Sector Online Services portal, which though online seems to be linked to from nowhere, outside or inside the BC government.
This puts the claim that “clients can now access NRS online services” a little in doubt. Sure, they “can” access the services, but since the services are basically hidden, do they access the online services?
This new portal is one of the products of the $50M spent so far. It has an “OK” design, a bit wasteful of screen real estate and bandwidth, but clean and not too “last century”.
The portal also has a bunch of content and links to existing processes, which would be more impressive if they were not duplicative of content and links already assembled and put on the web (in the last decade) by Front Counter BC.
The $50M folks at NRPP appear to have mostly taken the content from Front Counter BC and re-skinned it using their modern web design, but provided vanishingly little value beyond that.
- Here’s the info on forestry activities from Front Counter and here’s the info from NRPP.
- Here’s the info on Christmas Tree permits from Front Counter and here’s the info from NRPP.
Re-packaging existing in-house knowledge and claiming it for your own is an old consultant trick from way, way back. Mark Twain once joked that “an expert is anyone who comes from more than 60 miles away”, and little seems to have changed since his time.

290 data layers are now accessible through NRS Online Services
Indeed they are, or at least quite a few layers are; I didn’t bother to count. However, like the portal, the mapping application is a recapitulation of functionality that government has been providing for a decade. Way back in 2002, the “Ministry of Sustainable Resource Management” was tasked to “deliver a corporate land and resource information data warehouse”: that is, a collection of all the land information in BC, and a web view of those layers. The warehouse and web maps have been around in various forms ever since.
In many technical respects the NRS map is superior to the old ImapBC (it’s more modular and reusable) but for the purposes of this post note that (a) like the portal it’s carefully hidden from public view and (b) it’s still not a net-new gain of functionality on a project that’s $50,000,000 in.

Hunters will be able to register online for the Limited Entry Hunt in mid-April 2016
Again, this is true, but yet again there’s less there than meets the eye. NRS was going to transform resource tenuring: one account for all users; new modern and modular technology; change the way the land base is managed.
None of that has happened here.
I took the app for a test drive (not so far as applying for a license, though maybe I should have) and what stuck out for me is:
- The business process is basically “paper form on the web”. You still need a special “Hunter Number” to apply – the business process clearly hasn’t been transformed at all, nor integrated into a “one process” framework.
- Technologically, if you peel back the web code and look underneath, the whole thing is being managed by a system called “POSSE”.
So the “new” Limited Entry Hunt app has the same smell as the portal itself. Finding themselves unable to meet their stated goals of business transformation and new technology, NRPP is now building Potemkin deliverables using old business process and old technology.
Of course, having met one deliverable by giving up on “transformation” and just stuffing existing business process into web forms, what are the odds that NRPP will go on to do the same for the whole portfolio and then declare “victory”? Very high, very high indeed.

Legislation will be introduced in Spring 2016 to move selected Fish and Wildlife authorizations to a criteria based notification model
This was the only promise not tied to technology deliveries, and sadly it looks like it perished at the hands of a government too tired out to pass substantive legislation. I searched the Hansard for the spring 2016 session and did not find any evidence that the legislation was introduced.

Recap
On one side of the ledger:
- $50,000,000 in spending, an army of consultants and staff, in fact a whole “Transformation Secretariat”.
On the other side of the ledger:
- A “portal” nobody can find, full of content other people assembled.
- A map nobody can find, full of content that has been accessible for a decade.
- An app built on old technology using the same old business process.
- Legislation that was not introduced.
Here are the things we cannot blame this on:
- Stupid people
- Bad intentions
- Political shenanigans
- Graft or corruption
Here are the things we can blame this on:
- Excessive size and ambition of the project
- Elevation of process over product
NRPP was/is a mistake. It’ll deliver something, in the end, but that something won’t be worth 10% of the money that is spent to achieve it. Hopefully NRPP is the last of the “transformation” projects to come out of government, and future business process improvement/integration efforts can evolve incrementally over time, at 10% of the cost and 10% of the risk.
The Open Source story is a bit like a fairy tale. Highly motivated developers, joyfully beavering away in the middle of the night, to create high quality software systems, which they give away for free.
Why do so many people give away so much of their time? Why are these volunteers so effective? Why does open source work? Why has the business world found the open source formula so hard to replicate?
Surprisingly, many of the answers are found in our core morals and ethics.
The question of Open versus Proprietary actually breaks down into a series of sub-questions.
- Should you use Free Software or Free Data?
- Should you design systems using Open Architectures and Open Standards?
- Does it make sense to contribute back to communities?
- Is there a business case to help lead community initiatives?
- And if so, should you help scale community and tap into the world’s collective intelligence?
This is a big topic and we have limited time, so I will focus on some of the key messages, mostly at the “use and implement” end of the continuum.
Let's start by asking why you might use Open Source GIS software. If you are starting from scratch, the answer is simple. There is a comprehensive stack of mature, widely used and widely supported Open Source geospatial applications, all available for free.

This is a screenshot from the OSGeo-Live software distribution. OSGeo-Live includes 50 of the best geospatial Open Source applications, along with sample data, project overviews, and quickstarts for each application.

Let's look at a few of the more popular applications:
QGIS is a desktop GIS application similar to ArcGIS, with comparable features, but it is free.
Cesium provides a three-dimensional globe of the earth, like Google Earth, but free.
GeoServer is a map rendering server, similar to ArcGIS Server. It is the reference implementation for a number of the OGC standards, and is … free.
PostGIS adds spatial functionality to the Postgres database. It is comparable in maturity, stability, performance and features to Oracle Spatial and Microsoft SQL Server, except it is … free.
For free data, you can use Open Street Map and Open Route Map. This data is typically pretty good and suitable for most use cases, but still not as consistent as datasets such as Google Maps.
Ok, so the software and data can be free, but there is more to applications than just the purchase price. There is deployment, maintenance, training, support. And who are you going to call at 2am if something goes wrong?
And that is where companies like Jirotech, EnterpriseDB and Red Hat step in. They backfill the capabilities of organisations deploying these free applications with enterprise-level support and services.
So we have covered the first obvious question: “Does open source compete favourably feature-for-feature?” It does. But we have only just started. When considering an organisation's technical roadmap, there are more reasons for selecting Open strategies. Let's start by considering some of the characteristics of the digital age.
The amount of information in the world doubles every two years. And the amount of software created is growing at a similar rate.
Odds are that any software you own will be out-innovated within a year or two. Your software is not an asset! Your software is a liability! It needs to be updated, maintained, and integrated with new systems. It is technical debt, and you should try to own as little of it as possible. You can achieve this by purchasing Proprietary Software, by using Software as a Service, or by leveraging Open Source.
Because software is so time consuming to create and so easy to copy, it is excessively prone to monopolies. This holds true for both proprietary and open source products. A product that becomes a little better than its competitors will attract users, developers and sponsors, which in turn allows that product to grow and improve quickly, allowing it to attract more users. This highly sensitive positive feedback leads to successful software projects becoming category killers. Where Open Source and Proprietary business models differ is in how they respond to monopolies. Proprietary companies are incentivised to lock out competition and increase prices as much as the market will bear. However, open source licenses are structured such that multiple companies can support the same open source product, so the market self-corrects any tendency toward price-fixing.
This leads us to Vendor Lock-In. Vendor lock-in occurs when replacing a vendor's product would significantly impact your business. It is a significant risk, as vendors then have excessive influence over price and your future technical design options. There are two key strategies to mitigate against vendor lock-in:
- Use open source, as multiple vendors can all support the same codebase.
- Design modular architectures based on open standards.
Using modular architectures:
- reduces system complexity,
- which reduces technical risk,
- and facilitates sustained innovation.
It means you can improve one module without impacting the rest of your system. This helps with maintenance, innovation, and keeping up with the latest technologies.
Committing to and sustaining a modular architecture requires continual vigilance and forward thinking, especially when acquiring new systems. There will always be quick fixes and vendors offering more features if you are prepared to accept a level of lock-in. You should be considering:
- Long term maintenance,
- Ability to integrate with other systems,
- And the cost of a future exit strategy.
What I’ve described so far is practical, mainstream advice. Using open standards, open source and open data is now promoted in government policies and purchasing guidelines, and can be justified based on sound traditional economics. But the Open Source culture is not based on traditional economics.
Open Source and Open Data communities are usually founded on gift cultures, and retain the principles of the gift culture in their DNA. If you wish to successfully engage with these open communities, if you wish to have these communities adopt and maintain your codebase, it helps to understand and respect these gift cultures. And this starts by understanding our human desire to do things that are intrinsically good and valuable.
Which brings us to the topic of motivation. While traditional carrot-and-stick incentives improve motivation for boring, mechanical tasks, research has shown them to be counter-productive for higher-order thinking, such as creative software development. Dan Pink has collated this motivational research into a compelling book called Drive, where he describes how we humans are wired with deeper and more effective motivations. Namely: …
Autonomy, the desire to be self directed.
Mastery, the urge to get better at stuff.
And purpose, the desire to do something with meaning and importance. So if we facilitate the collaboration of highly motivated people, with the interconnectedness of the internet, and provide them with creative tools, amazing things happen.
- Wikipedia, which has displaced Encyclopedia Britannica as the authoritative information source,
- And Linux, which is the dominant operating system in IT service centres,
- And Open Street Map, which provides detailed maps of the entire world,
- And the OSGeo-Live distribution of Open Source Geospatial Software, a project I’ve been involved in for close to 10 years and which has attracted hundreds of contributors.
So how does this translate to attracting and engaging communities? Professor Charles Schweik tackled this question. He and his team studied thousands of Open Source projects to identify common characteristics of successful projects, and they came up with some interesting findings. Like:
- Most successful open source projects are small, with just one, two or three developers. This is surprising if your exposure to Open Source is through media stories, which almost exclusively reference large projects such as Linux or Android.
- Also, most open source projects are abandoned: 5 out of 6, according to Charles's research.
But this is not a weakness; the low success rate is actually a good thing. Developers vote with their time, and only great projects survive.
Also, when your developers are also users wanting to scratch an itch, they are the best qualified to decide what is best for a project.
And when your developers are motivated by Autonomy, Mastery and Purpose, they will be motivated to spend extra time to “Get things right” rather than compromise on quality.
What Charles's team found from their research was that successful projects usually:
- Have a clearly defined vision,
- Provide clear utility,
- Have leaders who lead by doing,
- Attract an active community,
- Offer fine-grained task granularity, making it easier for people to contribute,
- And often benefit from attracting financial backing.
Attracting volunteers involves helping maximise the unique, intrinsic value a person can contribute in the limited time they have available. Effectively, you maximise the usefulness and moral return on effort.
This starts with a clear and compelling vision, inspiring enough that others want to adopt the vision and work to make it happen. This should be followed by a practical and believable commitment to deliver on the vision. Typically this is demonstrated by delivering a “Minimum Viable Product”.
Then you need to be visibly in need of help, preferably accepting small, modular tasks with a low barrier to entry, and ideally something which each person is uniquely qualified to provide. If anyone could fix a widget, then maybe someone else will do it. But if you are one of a few people with the skills to do the fixing, then your gift of fixing is so much more valuable, and there is a stronger moral obligation for you to step up.
As an example, I’ll reference the OSGeo-Live project I’ve been involved in. Ten years ago, the Open Source Geospatial Foundation was a collection of Open Source applications, but lacked consistent marketing and was difficult for new users to navigate and understand. So we proposed to package all the applications on a DVD, ready to run, with sample datasets and consistent documentation. This was our vision. We then created a minimal first version of the distribution, demonstrating our commitment. As some of us were on the organising committee of the next international geospatial Open Source conference, we committed to hand out the DVD at the conference, creating a targeted marketing pipeline. This provided clear value for the developers we were recruiting. Then we provided simple guides on how to write installers and documentation, and went to the open source developers saying: “If you package your application and write documentation, like this…, then you can tap into a targeted marketing pipeline.” This made it easy for developers to provide discrete and uniquely valuable contributions. And it worked. We have attracted hundreds of volunteers to package 50+ projects, with documentation translated into over 10 languages, updated every 6 months.
Ok, so maybe you might be thinking that giving back to open communities is noble, worthy, the right thing to do. But there is no way you’d be able to justify it to management. You wouldn’t be the first to face this dilemma. In fact, we are often invited to help with various permutations of this question. In helping, we typically mention “Opportunity Management”. Opportunity Management is the reverse of Risk Management: instead of identifying what could go wrong and putting strategies in place to prevent it, you identify things that could go right, then put strategies in place to help make them happen. Help an open source community, and the number of users, developers and sponsors will grow, and you will indirectly reap the benefits.
So what have we covered?
- Software is a liability.
- Minimise your technical debt.
- Design modular architectures with Open Standards.
- It reduces vendor lock-in.
- There is a breadth of Open Source applications which are feature rich, mature and commercially supported.
- And there is Open Data available to address many of your use cases.
To take things to the next level, to engage with Open Source communities and tap into their collective creativity, you should re-learn how gift cultures work. The beautiful part of this is that it involves reconnecting with our inner morals and ethics, and doing the right thing.