Stratatomic Wins 2021 WebAward for Copper River Grill Website
Stratatomic was recently honored with a
2021 WebAward™ for Outstanding Website in the Restaurant category for the Copper River Grill website available online at
CopperRiverGrill.com.
In its 25th year, the WebAward program is the longest-running annual website award competition dedicated to naming the best websites in 96 industries while setting the standard of excellence for all website development. The annual WebAward competition is produced by the Web Marketing Association, founded in 1997 to help set a high standard for internet marketing and the development of the best websites on the World Wide Web. To learn more, please visit the WebAward website at
webaward.org.
The new website debuted in 2020 and is the third site Stratatomic has developed for Copper River Grill over the course of a business relationship that dates back to 2006. It includes a number of enhancements that have helped CRG keep pace with changing trends in design and technology. The award-winning website features
online ordering, full lunch and dinner menus with wine and bar selections, Early Bird menu, Ultra Fit menu, gluten-sensitive menu, family-style takeout menu, locations, careers and social media integration. One of the highlights is the incorporation of the
Support Animal Rescue page which serves as a fundraiser for
Auction For A Kaws, a non-profit organization focused on providing housing, veterinary care and rehabilitation for abused and neglected animals. The page features the
Animal Rescue Hall of Fame, showcasing a gallery of photos of local animals that have been rescued. Copper River Grill utilizes Stratatomic's
WebAdmin™ technology to maintain the gallery and keep it up to date with new rescues.
Dan Angell, CEO of Copper River Grill, remarked: "We are extremely pleased with our new site. The menu presentation is better than ever, the site is very easy to navigate, and the photos are great.
We have had more traffic and we have increased our gift card sales and E-Club memberships since the site went live. We are also very happy that our new website has increased awareness and support for local animal rescue groups in our communities. Thanks to Ryan Owens and the team at Stratatomic for their hard work, creativity and insight."
Stratatomic was previously honored with a
2012 Horizon Interactive Award for the second iteration of the CopperRiverGrill.com website as well as a
2014 Horizon Interactive Award (Gold) for the Copper River Grill Email Campaign.
Stratatomic Wins 2020 Horizon Interactive Award for ISOFlex Packaging Website
Stratatomic was recently honored with a
2020 Horizon Interactive Award in the Corporate / B2B Website category for the ISOFlex Packaging website available online at
ISOFlexPackaging.com.
The Horizon Interactive Awards is a leading international interactive media awards competition. The 2020 competition received just over 600 entries from around the world, including entries from 34 of the 50 US states and from 14 other countries: Australia, Canada, Germany, Hong Kong, India, Ireland, Qatar, Russia, South Korea, Sri Lanka, Switzerland, Taiwan, Turkey, and the UK.
Now in its 19th year, the
Horizon Interactive Awards competition was created to recognize excellence in interactive media production worldwide. Since 2001, the competition has received tens of thousands of entries from many countries around the world and nearly all 50 US states. Each year, those entries are narrowed down to the "best of the best," which are recognized and promoted on an international stage for their excellence. The judging process involves a blend of the Horizon Interactive Awards advisory panel and an international panel of volunteer judges consisting of industry professionals.
The Horizon Interactive Awards holds the competition each year with the winners being announced the following April. For more information visit the Horizon Interactive Awards online at
horizoninteractiveawards.com.
ISOFlex Packaging meets the critical needs of the laminating, printing, coating and converting markets. The company's films serve a broad variety of end-use markets, including food and beverage, consumer products, industrial film and bags, agricultural products, medical and pharmaceutical and other innovative products. As part of the Sigma Plastics Group, the largest privately owned film extrusion group in the U.S., Canada and Central America, ISOFlex Packaging has a total film and bag capacity of 350 million pounds and operates seven facilities throughout the U.S., in Chicago, IL; Gray Court, SC; Nashville, TN; Pompano Beach, FL; Statesville, NC; Vancouver, WA; and Washington, IN. ISOFlex Packaging is also a Certified Minority Business Enterprise and an ISO 9001:2000 Certified enterprise. For more information, visit them online at
ISOFlexPackaging.com.
Google's Next Phase:
Context is King
Content has ruled the internet and Search Engine Optimization (SEO) strategy for some time now. Stratatomic has always emphasized to our clients the importance of creating good, quality content, and we'll take care of the rest. Optimizing that content means making sure Google can find and understand it, and can in turn deliver the most relevant content to its users.
As the march towards Artificial Intelligence goes on, Google continues to do more of the work for both content creators as well as search users. As with anything, this can be both good and bad, with
unintended consequences.
Ultimately, that means Google has more and more control over the search results that we see, and in some cases the answer or information may be returned directly in the Google results themselves, offering no real reason or even opportunity for the search user to visit the actual website that delivered the content in the first place. You've probably already noticed this in the search results and "snippets" you see on a daily basis, and this will only continue to grow.
Just as COVID intensified the shift to a digital economy, the AI revolution comes at a time when
the browser "cookie" is destined to die, or at least evolve into something else entirely. We've seen
Google Analytics update to this new reality, and the results are not good, at least as the technology stands today compared to what we are used to. As Google and Big Tech move towards more privacy protections (not necessarily willingly) for their users, the user profiles and hoards of data they've been gathering on us all will change and adapt too, as they seek to replace all of that data with better AI, machine learning and other secret things they aren't telling us about.
And if you haven't been paying attention, just look at how the Search Engine Results Pages (SERPs) appear now, with more and more ads squeezing out the organic results. These paid ad placements were once more clearly delineated, but have evolved to cleverly blend in with the organic results. On some SERPs you may see more ads than organic listings, but it has happened so gradually that by now most people may not notice the difference. Of course, as always, the answer to all of your questions is Money. Google is an Advertising company, lest you forget.
Rise of the Machines
At its
Search On event last week, Google introduced several new features that, taken together, are its strongest attempts yet to get people to do more than type a few words into a search box. By leveraging its new
Multitask Unified Model (MUM) machine learning technology in small ways, the company hopes to kick off a virtuous cycle: it will provide more detail and context-rich answers, and in return it hopes users will ask more detailed and context-rich questions. The end result, the company hopes, will be a richer and deeper search experience.
Google SVP Prabhakar Raghavan oversees search alongside Assistant, ads, and other products. He likes to say — and repeated in an interview this past Sunday — that "search is not a solved problem." That may be true, but the problems he and his team are trying to solve now have less to do with wrangling the web and more to do with adding context to what they find there.
AI Will Help Google Explore the Questions People Are Asking
For its part, Google is going to begin flexing its ability to recognize constellations of related topics using machine learning and present them to you in an organized way. A coming redesign to Google search will begin showing "Things to know" boxes that send you off to different subtopics. When there's a section of a video that's relevant to the general topic — even when the video as a whole is not — it will send you there. Shopping results will begin to show inventory available in nearby stores, and even clothing in different styles associated with your search.
For your part, Google is offering — though perhaps "asking" is a better term — new ways to search that go beyond the text box. It's making an aggressive push to get its image recognition software Google Lens into more places. It will be built into the Google app on iOS and also the Chrome web browser on desktops. And with MUM, Google is hoping to get users to do more than just identify flowers or landmarks, but instead use Lens directly to ask questions and shop.
"It's a cycle that I think will keep escalating," Raghavan says. "More technology leads to more user affordance, leads to better expressivity for the user, and will demand more of us, technically."
Google Lens will let users search using images and refine their query with text. Image: Google
Those two sides of the search equation are meant to kick off the next stage of Google search, one where its machine learning algorithms become more prominent in the process by organizing and presenting information directly. In this, Google's efforts will be helped hugely by recent advances in AI language processing. Thanks to systems known as large language models (MUM is one of these), machine learning has gotten much better at mapping the connections between words and topics. It's these skills that the company is leveraging to make search not just more accurate, but more explorative and, it hopes, more helpful.
One of Google's examples is instructive. You may not have the first idea what the parts of your bicycle are called, but if something is broken you'll need to figure that out. Google Lens can visually identify the derailleur (the gear-changing part hanging near the rear wheel) and rather than just give you the discrete piece of information, it will allow you to ask questions about fixing that thing directly, taking you to the information (in this case, the excellent Berm Peak YouTube channel).
The push to get more users to open up Google Lens more often is fascinating on its own merits, but the bigger picture (so to speak) is about Google's attempt to gather more context about your queries. More complicated, multimodal searches combining text and images demand "an entirely different level of contextualization that we the provider have to have, and so it helps us tremendously to have as much context as we can," Raghavan says.
We are very far from the so-called "ten blue links" of search results that Google provides. It has been showing information boxes, image results, and direct answers for a long time now. Today's announcements are another step, one where the information Google provides is not just a ranking of relevant information but a distillation of what its machines understand by scraping the web.
In some cases — as with shopping — that distillation means you'll likely be sending Google more page views. As with Lens, that trend is important to keep an eye on: Google searches increasingly push you to Google's own products. But there's a bigger danger here, too. The fact that Google is telling you more things directly increases a burden it's always had: to speak with less bias.
By that, I mean bias in two different senses. The first is technical: the machine learning models that Google wants to use to improve search have well-documented problems with racial and gender biases. They're trained by reading large swaths of the web, and, as a result, tend to pick up nasty ways of talking. Google's troubles with its AI ethics team are also well documented at this point — it fired two lead researchers after they published a paper on this very subject. As Google's VP of search, Pandu Nayak, told The Verge's James Vincent in his article on today's MUM announcements, Google knows that all language models have biases, but the company believes it can avoid "putting it out for people to consume directly."
A new feature called "Things to know" will help users explore topics related to their searches. Image: Google
Be that as it may (and to be clear, it may not be), it sidesteps another consequential question and another type of bias. As Google begins telling you more of its own syntheses of information directly, what is the point of view from which it's speaking? As journalists, we often talk about how the so-called "view from nowhere" is an inadequate way to present our reporting. What is Google's point of view? This is an issue the company has confronted in the past, sometimes known as the "one true answer" problem. When Google tries to give people short, definitive answers using automated systems, it often ends up spreading bad information.
Presented with that question, Raghavan responds by pointing to the complexity of modern language models. "Almost all language models, if you look at them, are embeddings in a high dimension space. There are certain parts of these spaces that tend to be more authoritative, certain portions that are less authoritative. We can mechanically assess those things pretty easily," he explains. Raghavan says the challenge is then how to present some of that complexity to the user without overwhelming them.
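The picture Raghavan sketches, of documents as vectors in a high-dimensional space whose nearness can be "mechanically assessed," can be made concrete with cosine similarity, a standard way to compare embeddings. The 4-dimensional vectors and document names below are invented for illustration; real models learn vectors with hundreds or thousands of dimensions from text.

```python
import math

# Toy embedding space: each "document" is a vector. Nearness between
# vectors (cosine similarity) is the mechanical assessment Raghavan
# describes. All values here are made up for the example.
embeddings = {
    "medical journal": [0.9, 0.1, 0.8, 0.2],
    "health forum post": [0.7, 0.3, 0.6, 0.4],
    "celebrity gossip": [0.1, 0.9, 0.2, 0.8],
}

def cosine_similarity(u, v):
    # Dot product of the vectors divided by the product of their lengths;
    # 1.0 means identical direction, 0.0 means unrelated.
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# A hypothetical health-related query vector: it lands closest to the
# health-related documents, farthest from the unrelated one.
query = [0.8, 0.2, 0.7, 0.3]
for name, vec in embeddings.items():
    print(name, round(cosine_similarity(query, vec), 3))
```

In a real system, "more authoritative" regions of the space would be identified the same mechanical way: by measuring distance to vectors of documents already judged authoritative.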
Can Google Remain Neutral if it's Delivering Answers to Users Directly?
But I get the sense that the real answer is that, for now at least, Google is doing what it can to avoid facing the question of its search engine's point of view by avoiding the domains where it could be accused of, as Raghavan puts it, "excessive editorializing." Often when speaking to Google executives about these problems of bias and trust, they focus on easier-to-define parts of those high-dimension spaces like "authoritativeness."
For example, Google's new "Things to know" boxes won't appear when somebody searches for things Google has identified as "particularly harmful/sensitive," though a spokesperson says that Google is not "allowing or disallowing specific curated categories, but our systems are able to scalably understand topics for which these types of features should or should not trigger."
Google search, its inputs, outputs, algorithms, and language models have all become almost unimaginably complex. When Google tells us that it is able to understand the contents of videos now, we take for granted that it has the computing chops to pull that off — but the reality is that even just indexing such a massive corpus is a monumental task that dwarfs the original mission of indexing the early web. (For the record, Google is only indexing audio transcripts of a subset of YouTube videos, though with MUM it aims to add visual indexing and other video platforms in the future.)
Often when you're speaking to computer scientists, the
traveling salesman problem will come up. It's a famous conundrum where you attempt to calculate the shortest possible route between a given number of cities, but it's also a rich metaphor for thinking through how computers do their machinations.
"If you gave me all the machines in the world, I could solve fairly big instances," Raghavan says. But for search, he says that it is unsolved and perhaps unsolvable by just throwing more computers at it. Instead, Google needs to come up with new approaches, like MUM, that take better advantage of the resources Google can realistically create. "If you gave me all the machines there were, I'm still bounded by human curiosity and cognition."
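The combinatorial explosion Raghavan is gesturing at is easy to see in code. A minimal brute-force sketch of the traveling salesman problem (with made-up city coordinates) simply tries every possible route; the number of routes grows factorially, which is why "fairly big instances" quickly exhaust even all the machines in the world.

```python
from itertools import permutations

# Made-up (x, y) coordinates for four cities.
cities = {
    "A": (0, 0),
    "B": (1, 5),
    "C": (5, 2),
    "D": (6, 6),
}

def dist(p, q):
    # Straight-line distance between two points.
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def tour_length(order):
    # Total distance of visiting cities in `order`, then returning home.
    return sum(
        dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
        for i in range(len(order))
    )

# Brute force: check all n! orderings and keep the shortest round trip.
names = list(cities)
best = min(permutations(names), key=tour_length)
print(best, round(tour_length(best), 2))
```

With 4 cities there are 24 orderings to check; with 20 cities there are more than 2 quintillion, which is the sense in which throwing more computers at the problem stops helping.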
Google's new ways of understanding information are impressive, but the challenge is what it will do with the information and how it will present it. The funny thing about the traveling salesman problem is that nobody seems to stop and ask what exactly is in the case, what is he showing all his customers as he goes door to door?
The article above is excerpted from theverge.com.