A glimpse of HTML 5.1
The release of the HTML5 standard about two years ago was a big deal in the web development community. Not only because it came packing an impressive list of new features, but also because it was the first major update to HTML since HTML 4.01 was released in 1999. You can still see some websites bragging about the use of the “modern” HTML5 standard today.
Fortunately, we didn’t have to wait quite that long for the next iteration of HTML. In October 2015, the W3C started working on the draft of HTML 5.1 with the goal of fixing some of the issues left open in HTML5. After many iterations, it reached “Candidate Recommendation” status in June 2016, “Proposed Recommendation” status in September 2016, and finally became a W3C Recommendation in November 2016. Those who followed this development probably noticed that it was a bumpy ride: a lot of initial HTML 5.1 features were dropped due to poor design or a lack of browser vendor support.
While HTML 5.1 was still in development, the W3C had already started working on a draft of HTML 5.2, which is expected to be released in late 2017. In the meantime, here’s an overview of some of the interesting new features and improvements introduced in 5.1. Browser support is still lacking for these features, but we’ll point you to at least one browser in which each example can be tested.
The SitePoint article covers:
- Context menus
- The details and summary elements
- More input types
- Responsive images (the srcset and sizes image attributes)
- Validating forms
- Allowfullscreen for frames
Read the article here: SitePoint
For decades, clinical trials were conducted with the help of paper-based case report forms (CRFs). This has been changing over the last 30 years. In this presentation I show how electronic CRFs (eCRFs) gradually superseded paper-based CRFs.
By the 1980s there was a very complex regulation and a well-established practice for managing paper CRFs. It turned out that this regulation and practice could not be adequately applied to eCRFs: the processes of randomisation, query handling and adverse event reporting had to be reconsidered.
Safety concerns are always present with respect to a computer system. But when that system collects sensitive data in order to obtain valuable patterns, these concerns are heightened. I give examples of how the relevant regulation, Title 21 CFR Part 11, controls such data.
It is also important to understand how other standards, such as CDISC (Clinical Data Interchange Standards Consortium), and their development influence the implementation of eCRFs.
In a study design, the choice between a paper-based and an electronic CRF is partly an economic question. This aspect has already been investigated in detail, and I will summarize how the costs of the two tools can be compared.
At the end I offer a view of the most probable future of the eCRF concept.
Today’s challenge is a geographical one. Do you know which are the most populated cities in the world? Do you know where they are? China? The USA? And, by way of contrast, do you know which are the smallest cities in the world?
Today we want to show you where you can find the largest and the smallest cities in the world by population on a map. While there is general agreement from trustworthy sources on the web about which are the most populated cities, agreement becomes sparser when looking for the smallest cities in the world. There is general agreement though about which ones are the smallest capitals in the world.
We collected data for the world’s 125 largest cities in one CSV text file, and data for the 10 smallest capitals of equally small and beautiful countries in another. The data include city name, country, area in square kilometers, population, and population density. Today’s challenge is to locate these cities on a world map. Technically, this means:
- To blend the city data from the CSV files with the city geo-coordinates from the Google Geocoding API inside KNIME Analytics Platform
- Then to blend the ETL and machine learning of KNIME Analytics Platform with the geographical visualization of OpenStreetMap.
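The geocoding step above is done with KNIME nodes rather than hand-written code, but the shape of the underlying request is easy to sketch. The snippet below is a minimal, hypothetical illustration: the column order, the sample row and the `YOUR_API_KEY` placeholder are assumptions for demonstration, not taken from the actual challenge files.

```javascript
// Parse one city row from a CSV line (assumed column order:
// city, country, area in km², population, density).
function parseCityLine(line) {
  const [city, country, areaKm2, population, density] = line.split(",");
  return {
    city,
    country,
    areaKm2: Number(areaKm2),
    population: Number(population),
    density: Number(density),
  };
}

// Build the Google Geocoding API request URL that returns the
// latitude/longitude for one city.
function geocodeUrl(row, apiKey) {
  const address = encodeURIComponent(`${row.city}, ${row.country}`);
  return `https://maps.googleapis.com/maps/api/geocode/json?address=${address}&key=${apiKey}`;
}

const tokyo = parseCityLine("Tokyo,Japan,2191,13617000,6215");
console.log(geocodeUrl(tokyo, "YOUR_API_KEY"));
```

The JSON response of each such request carries the coordinates that are then joined back onto the city rows before plotting them on the map.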
The benefits you will realize by choosing a remote development agency can be significant.
Remote teams often have more experience and greater talent than local coding enterprises. They serve the global marketplace and have a solid understanding of recent trends and technologies.
Good remote coders have learned and practiced ways to become highly productive. And they constantly strive to improve their skills.
Being remote not only requires better communication; it forces it to happen. How? Through weekly meetings, chat sessions, and exceptional clarity in briefs, specifications, and feedback. These are all part of the overall scheme.
Keep these four golden rules in mind when working with a remote agency:
- A clear brief helps to get a project off to a solid start. Collaboration can get underway even before your brief is finalized when you have a good agency working with you. Be willing to listen to any guidance the project manager may offer. It can help to get everyone on the same page as the project gets underway.
- Practicality is important. Your coding agency should advise you on coding best practices and the technologies they plan to put into play. The final decision is yours, but it should be a knowledge-based decision. Technologies may vary, but the maker’s touch is essential.
- Communicate efficiently and effectively. Understand your responsibilities toward making collaboration work by preparing yourself for periodic briefings. Avoid making changes once the project is in development. If you must do so, be prepared to negotiate new deadlines. Work with the agency to determine which means of communications will work best.
- Listen to the specialists. Professional developers always have the user in mind. Criticism is intended to be constructive, and it is given with the best of intentions. The feedback you receive is based on solid, varied coding experience.
Read full article: here
The Progressive Web Apps (PWA) technology developed by Google has been available to the public for almost a year, but relatively few people outside the world of hardcore developers are aware of what exactly they are, and how they can use them to their benefit.
It’s important to recognize that there are a few different concepts bundled up into the term “Progressive Web App” and that these individual parts have been available in one form or another before being tied together under one package. Those parts are: Service Worker, App Shell and JSON Manifest.
The Service Worker
The single most interesting component is that of the service worker script. This script acts as an additional layer between the website requests and the internet servers around the world.
It is also responsible for caching content when a visitor browses a PWA enabled page, and stores that data locally on the visitor’s device, whether that is a mobile phone, tablet or desktop computer.
This means that every time we click a link on a website with a Progressive Web App, the request passes through the service worker script and then, based on the rules set forth, the script goes online and asks for the new web page.
Alternatively, if the user is offline it is possible to have the service worker serve a cached page from the local storage, meaning that we can now design websites that will work 100% when browsed offline, as long as that user has been to the site at least once before.
While the idea of caching content and serving it to users without internet access is not by any means a new one, the combination with an app shell is powerful and offers entirely new ways of thinking about web development.
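The cache-then-network decision described above can be sketched in a few lines. This is a minimal illustration, not the code of any particular PWA: the cache and network below are simulated with plain in-memory stand-ins so the logic is readable (and runnable) outside a browser, whereas a real service worker would use the Cache Storage and Fetch APIs inside a `fetch` event listener.

```javascript
// Cache-first strategy: serve a stored copy if we have one,
// otherwise go online, and remember the response for next time.
async function respond(request, cache, network) {
  const cached = await cache.match(request);
  if (cached) return cached;            // works even when offline
  const fresh = await network(request); // go online for a new page
  await cache.put(request, fresh);      // store it on the device
  return fresh;
}

// Tiny in-memory stand-ins for the browser's cache and the network.
const fakeCache = {
  store: new Map(),
  match(req) { return Promise.resolve(this.store.get(req)); },
  put(req, res) { this.store.set(req, res); return Promise.resolve(); },
};
const fakeNetwork = (req) => Promise.resolve(`page for ${req}`);

respond("/about", fakeCache, fakeNetwork).then(console.log); // prints "page for /about"
```

Once `/about` has been visited, a second call to `respond` returns the cached copy even if the network function fails, which is exactly the offline behaviour described above.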
Read more: Here
Ahead of his talk at Generate Sydney, the man behind ‘A Dao of Web Design’ offers a glimpse into the internet’s future.
Don’t miss John Allsopp’s opening session – Predict the future with this one weird old trick – at Generate Sydney on 5 September. Can’t get there? There are Generate conferences coming up in San Francisco and London too!
If anybody is qualified to predict the future, it’s John Allsopp. He is, after all, the man who just might have coined the inglorious term ‘Web 2.0’ (he can document his usage of it before Tim O’Reilly popularised the phrase).
Allsopp is also feted as one of responsive design’s founding fathers. His essay ‘A Dao of Web Design’, published in 2000, encourages designers to let go of print’s rigidity and embrace the web’s fluidity. Ethan Marcotte cites it as one of his key inspirations.
So what does Allsopp – who’s speaking at our Generate Sydney event in September – predict for the next century? You may be somewhat surprised by what he has to say…
Read the article here: Creative Bloq