The basics of social media marketing

For many years, social media has been an integral part of the Internet, and for a large part of the population also of professional and personal life. On Facebook, Instagram or TikTok, we buy, sell, express ourselves and establish relationships. These platforms are living organisms made up of millions of digital ants. Social media is constantly evolving, adjusting to trends in society and often creating them itself. It is also a set of giant marketing platforms that enable wide brand exposure, communication with audiences and, consequently, better sales results.

Social media platforms are not homogeneous – although their core remains similar, individual sites use slightly different solutions, have different user interfaces, focus on different customer groups and approach data management differently. However, you can find many common features and basic principles that should guide a social media specialist in their work. These are foundations that may seem obvious to an experienced marketer, but – as practice shows – they are neglected in surprisingly many cases.

Don’t lie

Almost everyone uses social media – you will find the entire cross-section of society there. Generation Z has come of age digitally in recent years, growing up in constant contact with computers, smartphones and the Internet. This means that the new generation is – in general – better at verifying information. People “born with a smartphone in their hand” are much more efficient than previous generations at using long-tail keyword searches and checking the accuracy of a message. Their constant presence on the Internet required it – the constant bombardment with messages naturally made it necessary to filter them.

What does this mean for the marketer? It is more and more likely that the average social media user will call your bluff and do decent research on the product and the entire brand. Only transparency in communication will guarantee that this does not end in a crisis. Basing your offers and image on half-truths is riskier today than ever, and sooner or later marketing manipulations usually turn against the dishonest.

Don’t limit yourself to one medium

Does your brand only have a website and a Facebook profile? That may not be enough – diversification of traffic sources has become a necessity. Your profile on a given portal may disappear in a fraction of a second, for example due to an algorithm error. Of course, you will probably regain access to your account, but it may take some time, and in the meantime every day cuts you off from a sales channel that is effectively irreplaceable. If you don’t spread your traffic sources, you are completely dependent on one of them. Only a skillful combination of SEO, Google Ads and marketing on various social networks will bring meaningful results.

Match the message to the platform

Each social media platform has its own rules – these result both from the technicalities and from the general target group of a given portal. There are of course some universal guidelines, but each portal is a kind of separate world with its own communication code. Twitter is no place for long essays, TikTok is ruled by video – you can of course fight it and go against the tide on a given platform, but most likely it will simply end in very poor results. Pay particular attention to the technical guidelines of a given portal, for example by choosing the size and format of graphics carefully for the platform.

Don’t count on organic

You refresh your Facebook feed, and after pressing F5 you see completely different posts than a second ago. You set the feed to “show newest” and a moment later you are again looking at what the algorithm has chosen for you. Big data rules social media – organic reach keeps shrinking, and portals find new ways to monetize users’ activities every day.

Although it happens less and less often, many social media specialists still believe it is better to publish, say, 4 or 5 sales posts a week instead of a single one backed by a decent paid campaign that significantly increases reach and click-throughs. Overproduction of content on social media hinders rather than helps, and by tying up the time of graphic designers and copywriters, such an approach can actually increase the financial and time cost rather than optimize it.

Don’t look at the number of likes

The number of likes really matters little. Of course, a large number on the profile adds a certain digital prestige, which matters for potential collaborations with other brands and for overall brand perception on the Web, but the impact of this factor is usually wildly overestimated.

Facebook is slowly phasing out the display of this value on fan pages, so the exposure of the number of “likes” is decreasing (the number of followers of a given page remains public, however). Algorithms reduce the reach of posts – organic traffic in social media is going through difficult times with no prospect of improvement. Without an advertising investment, your posts will not be shown to all of your followers.

It is safe to assume that if a fan page was set up a few years ago, some of its followers no longer use their Facebook accounts. These dead souls are of no value to the marketer – an inactive user is not a potential customer.

The same is true when we decide to buy followers – ethical issues aside, it will not actually help us sell the product in any way. Fake accounts “clog up” the display space for our posts and advertisements that could otherwise be used for marketing purposes. A fake account will not buy your product, yet it will burn through your advertising budget. In terms of prestige it is also a complete shot in the foot – a flood of likes from suspicious-looking foreign accounts will raise a red flag for anyone with anything to do with social media marketing.

Use Stories

Facebook, Instagram, Twitter, and recently even LinkedIn – all these portals focus heavily on fleeting content in the form of Stories. Social media posts have always been short-lived, and the Stories format makes the shelf life of a message shorter than ever.

However, the data show that a skillfully prepared Story can deliver great reach and sales results. To conduct effective marketing in social media, you cannot afford to disregard this format.

React quickly and use RTM

It is not only about responding to customer questions instantly, although this is of course also a very important aspect of social media marketing.

Is the occasional post for Valentine’s Day, the football World Cup or the presidential election real-time marketing? Theoretically yes, because it concerns current events, but such activities usually have zero marketing power. They are simply predictable – we have known the dates of these events for a long time and we see posts about them long before they actually happen, so there is no chance of a surprise effect.

Good real-time marketing is based on spontaneity, instinct and quick reaction. When internet buzz about a topic breaks out, it usually fades away after a few days. “Late” and “imitative” are two words that should definitely not be associated with real-time marketing. In many companies delays are of course due to the need to push a given post through every level of approval and have it accepted by several people. Real-time marketing, however, requires overcoming organizational inertia – in this case it is better to react as quickly as possible or not at all.

Core Web Vitals

The internet is constantly changing. The websites of its early days barely resemble those that accompany us today. Not only have their appearance and the way they present information changed, but so have our expectations. The differences also include the way a website is assessed – today it is not enough that it simply IS. In this context, we need to consider what Core Web Vitals are and what they actually tell us.

Core Web Vitals – website quality comes first

For a long time, Google has placed great emphasis on the speed and readability of the presented content. One of the first elements of these changes was certainly the shift towards users of mobile devices (the so-called “mobilegeddon”), which electrified SEOs and web developers in 2015. Later came the requirement to ensure the security of the website and its users through SSL. Now the focus is turning to user experience.

In May 2021, page quality will become part of the Google algorithm and will influence a site’s position in the search engine ranking. According to official sources, the quality of the website will not matter more than the content, but it will help promote websites that present that content in a manner tailored to the user’s needs.

The quality of the website will be analyzed in terms of several factors mentioned earlier:
– adaptation to mobile devices (mobile-first indexing for all websites from March 2021);
– browsing safety – no harmful and misleading content, i.e. malware and phishing elements;
– HTTPS;
– no full-screen ads that prevent access to website content.

In addition, there are the Core Web Vitals – referred to throughout this article as the Basic Internet Indicators. These indicators are designed to help you gauge the user experience of page loading and responsiveness during and after rendering.

Google emphasizes that work on the method of testing website quality will continue. In the coming years, the list of factors taken into account will certainly change and be expanded with new points. We will be informed about the changes in advance, and the search engine will also give us time to adapt our websites to the new requirements.

Basic Internet Indicators – let’s take a closer look at them

One of the more controversial SEO topics is website speed. In specialist groups we have witnessed, more than once or twice, long discussions about how a high PageSpeed Insights score improves a page’s position. In truth, the site’s rating in that tool does not translate directly into search engine rankings. It is known, however, that a website that displays its content correctly even on a weaker Internet connection will be warmly welcomed by users. This, in turn, translates into longer visits, willingness to share the link and a better chance of achieving the conversions we set.

The tests carried out by the tools I mentioned in the page speed article check specific elements of the site, but do not provide full information about the experience of a real user. Their analysis is necessary, but it focuses on aspects of the website other than the Basic Internet Indicators.

Core Web Vitals allow us to describe the experience that our website’s users actually have. The data used to determine the quality level of the website is collected from real visits. Thanks to this, the behavior of the website is examined across a wide cross-section of cases – different connection speeds, screen sizes and device capabilities.

In its current version, Core Web Vitals focuses on three elements:
– Largest Contentful Paint – the rendering of the largest piece of content,
– First Input Delay – the delay on the first interaction,
– Cumulative Layout Shift – the cumulative shift of the layout.

As I mentioned before, Google is planning to expand its website quality analysis and we can expect that this list will expand in the coming years. Google does not exclude the possibility of modifying current indicators, e.g. extending the scope of research.

LCP – Largest Contentful Paint – Largest content rendering

LCP determines the rendering time of the largest fragment of the page visible in the browser window (everything we see before scrolling). The elements that are taken into account are:
– graphics,
– video,
– an element containing background graphics, loaded with the url() function in CSS,
– block elements (e.g. <p>, <div>, <ul>, <ol>, <hx>, etc.).

If the content of individual tags extends below the fold, those fragments are not taken into account in the analysis. In the case of graphics, the analysis is based on the smallest size the element reaches: if the graphic is displayed smaller than its natural dimensions, its rendered size is what counts; if it is stretched larger than its natural dimensions, its intrinsic size is used instead.

How do you know which item is the largest? Web pages render in stages, and the HTML code is read in order – from the first line to the last. Therefore, as your site renders, the object marked “largest” may change. A given fragment of the page becomes the “largest contentful paint” only once it is fully loaded. Note – if the user scrolls while the page is rendering, further changes to the “largest contentful paint” are not taken into account.
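If you want to see for yourself which element the browser currently treats as the largest, you can watch the LCP candidates in the browser console. Below is a minimal sketch using the standard PerformanceObserver API (it only logs data for your own inspection and changes nothing on the page):

<script>
  // Log every "largest-contentful-paint" entry the browser reports.
  // entry.element is the DOM node currently considered the largest,
  // entry.startTime is its render time in milliseconds.
  new PerformanceObserver((entryList) => {
    for (const entry of entryList.getEntries()) {
      console.log('LCP candidate:', entry.element, entry.startTime);
    }
  }).observe({ type: 'largest-contentful-paint', buffered: true });
</script>

The last entry logged before the first user interaction is the element that counts for the final LCP score.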

What about items that load off-screen and only later appear within the user’s view? In most cases, these fragments will not be reported. In the opposite situation – when they render within the viewport and are then pushed below the scroll line – they will still be taken into account and may be included in the final result.

Largest content rendering – score

In PageSpeed Insights, values are provided in two ways:
– as a percentage, based on data collected from users over the last 28 days – in the example analyzed here, LCP was satisfactory in 39% of cases,
– in seconds, based on tests performed during the page load simulation.

PageSpeed Insights analyzes data for the specified address. For more detailed information about pages across the entire domain, it is worth checking Google Search Console and one of its newest tabs, labeled “Basic Internet Indicators”. Two reports are available there – one for pages displayed on mobile devices and one for desktop.

The report in GSC lists the pages with poor LCP results. Where the tool has identified the same problem on many subpages, the table contains one address and a note on the number of pages on which the error repeats. The exact addresses are available in the extended view, shown after clicking on the record.

Poor LCP Score – Effective Ways to Improve

The key to a good LCP score is optimizing the elements that slow down the website. A slow server, JS and CSS code that blocks further rendering, and client-side rendering – these are the most common causes of poor results. Let’s look at remedies for the most common problems:
– slow response from the server – the key value here is TTFB – Time To First Byte (the time from sending the request to receiving the first byte of the response). A slow response is reported e.g. in PageSpeed Insights and can also be checked with other website speed analysis tools.

The primary way to improve the TTFB score is to optimize the server and eliminate processes that degrade its performance. For this purpose, it is worth looking at the advice of your hosting provider or using the help of a specialist.

– JavaScript and CSS blocking rendering – the remedies here are those recommended for every website speed optimization: CSS minification, removing unused fragments and considering asynchronous loading of style files (a short sketch follows below this list). For JavaScript, the optimization is similar – minification of JS files and removal of unused fragments.

Optimizing other elements, e.g. compressing graphics, using a CDN and prioritizing the loading of certain resources, can also improve the LCP score.
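To make this more concrete, here is a small sketch of the two most common fixes mentioned above. The file names non-critical.css and app.js are only placeholders, and the media="print" trick for stylesheets is a popular pattern for deferring non-critical CSS, not an official requirement:

<!-- Check the TTFB of the current page (paste into the console or a script): -->
<script>
  const nav = performance.getEntriesByType('navigation')[0];
  // responseStart is measured from the start of navigation,
  // so it corresponds to the Time To First Byte.
  console.log('TTFB (ms):', nav.responseStart);
</script>

<!-- Non-critical CSS: fetched without blocking the first render
     and applied only once it has loaded. -->
<link rel="stylesheet" href="non-critical.css" media="print" onload="this.media='all'">

<!-- Non-critical JS: "defer" downloads the file in parallel and runs it
     only after the HTML has been parsed, so it does not block rendering. -->
<script src="app.js" defer></script>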

FID – First Input Delay – Delay on first input

FID refers to the interactivity of the website – it defines the time between the first action performed by the user and the moment the browser starts handling that action. Clicks (on mobile devices – taps) and key presses are taken into account. The FID score focuses on the response delay, but does not measure the processing time of the user-initiated event itself.
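For the curious, the same PerformanceObserver API used earlier for LCP can report this value on a real visit. A minimal sketch (it only logs the value to the console, nothing is sent anywhere):

<script>
  new PerformanceObserver((entryList) => {
    const entry = entryList.getEntries()[0];
    // processingStart - startTime = how long the browser was busy before
    // it could begin handling the user's first click, tap or key press.
    console.log('FID (ms):', entry.processingStart - entry.startTime);
  }).observe({ type: 'first-input', buffered: true });
</script>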

The lag on the first action will not always be measured – some users choose not to interact with the page while it is loading. Interestingly, in the case of this indicator, the analysis takes place only during the actual use of the website – similar studies are not carried out during the simulation performed by Lighthouse.

Every Internet user has certainly encountered a situation in which, right after entering a website, they wanted to move somewhere else. However, clicking a link or expanding the rest of the text was not immediately possible – the site’s reaction to our actions was significantly delayed. Why does this happen? While the site is rendering, the browser is busy processing the files that make up the page. Only after completing this process can it respond to user behavior. The waiting time for the page to become interactive translates directly into the user’s subjective assessment of it as “slow” or “fast”.

When describing LCP, I mentioned that HTML is loaded line by line – so why can’t the already-loaded fragments become active earlier? Interactive tags such as <input>, <a> or <select> cannot respond until the browser’s main thread is free. As for other obstacles – the <head> section of the HTML file usually contains links to CSS and JS resources that the browser must parse before continuing to render the page. Hence the earlier suggestions to minify these files (i.e. remove whitespace) and to move code fragments unnecessary for rendering the first view further down. In the case of events registered by “event listeners” placed in JS, handling them is possible only after all the code has been executed.

Delay on first action – result

A satisfactory result is in the range of 0-100ms. Above this time it is worth introducing corrections, and above 300 ms – you have to.

The information in PageSpeed Insights is based on real user data – in the example analyzed here, 82% of cases are satisfactory.

More detailed data, along with data on specific addresses that need improvement, are included in the “Basic Internet Indicators” tab in Google Search Console.

Poor FID score – effective ways to improve

Ideally, the tools would report 95-99% of “green responses”. If our current results are not that high yet, making changes to the site will certainly do them good.

As in the case of LCP, the first recommendation is to optimize the code needed to load the key elements of the website and to move lower-priority fragments further down the file. It is also worth removing unused fragments and minifying the rest.

The main reason for low FID results is usually the JS code. Optimizing files and breaking individual fragments into smaller, asynchronously executed batches should significantly speed up the rendering process and reduce resource locking.
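What does “breaking fragments into smaller batches” look like in practice? Below is a simplified sketch – processInBatches, processItem and items are placeholder names for your own logic and data. Instead of running one long task that blocks the main thread, the work is done in small portions, and setTimeout yields control back to the browser between them so it can react to user input:

<script>
  function processInBatches(items, processItem, batchSize = 50) {
    let i = 0;
    function runBatch() {
      const end = Math.min(i + batchSize, items.length);
      for (; i < end; i++) {
        processItem(items[i]); // your own per-item logic
      }
      if (i < items.length) {
        setTimeout(runBatch, 0); // yield to the main thread, then continue
      }
    }
    runBatch();
  }
</script>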

Code coming from outside the website will also affect the browser response speed – limiting it will reduce the number of processes performed by the browser.

CLS – Cumulative Layout Shift

The CLS score is the sum of changes caused by unexpected shifts in the page layout. A change counts when an element changes its position during rendering. New elements appearing on the page or existing ones being resized do not count towards CLS, as long as they do not cause other page components to move. Unlike the indicators listed above, CLS collects data throughout the entire visit to the page, not only in the first moments of rendering.
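Layout shifts can also be observed directly in the browser. The sketch below keeps a simple running total of unexpected shifts – shifts that happen right after user input (hadRecentInput) are skipped, exactly as described above:

<script>
  let cls = 0;
  new PerformanceObserver((entryList) => {
    for (const entry of entryList.getEntries()) {
      if (!entry.hadRecentInput) {
        cls += entry.value;
        console.log('layout shift:', entry.value, 'running total:', cls);
      }
    }
  }).observe({ type: 'layout-shift', buffered: true });
</script>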

Jumps in the page layout are caused, among other things, by asynchronously loaded resources or by elements dynamically added to the DOM structure. Such situations are noticeable, for example, on pages where graphics are loaded without a declared size – fitting the appropriate size to the screen may cause a sudden change in the appearance of the page, which in turn moves the user to another part of it.

Resources downloaded from external sources and built on the basis of values other than those of the target page, as well as personalized content generated without proper formatting, can also be problematic here. Unexpected shifts can also be caused by prolonged font rendering.

Not every shift is bad – depending on the page design, some user actions will change the original layout of the elements. What matters is that this type of shift is directly related to the action performed.

CLS is based on data collected from users and allows you to determine the actual experience on the site, which is not always available under the conditions in which the website developer works.

Cumulative layout shift – score

A good CLS score is in the range of 0-0.1. A score above 0.25 means that you need to make changes and improve your users’ experience. It is assumed that the website is stable when at least 75% of users receive a score below 0.1.

Data on layout shifts can be found in PageSpeed Insights. The percentage score is based on an analysis of user data – in the example analyzed here, 90% of users get a good score. PSI also provides information collected under laboratory conditions. Detailed information on further subpages can be obtained from Google Search Console.

Poor CLS Score – Effective Ways to Improve

A better CLS score can be achieved by adhering to general website building standards. Describing graphics and videos precisely plays a significant role here – adding attributes related to the element’s size (width, height and the aspect ratio derived from them) makes it easier for the browser to arrange elements correctly during rendering, without having to rearrange them while subsequent pieces of content load. In the case of dynamically added graphics, it is recommended to use placeholders that reserve space for the element added later.
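In practice this can look as follows – the file name, class name and reserved height are only examples:

<!-- Declared dimensions let the browser reserve the right amount of space
     before the image file arrives, so nothing jumps when it loads. -->
<img src="product.jpg" alt="Product photo" width="800" height="600">

<!-- For content added later (e.g. an ad or a dynamically loaded widget),
     a placeholder with a reserved height keeps the layout stable. -->
<div class="ad-slot" style="min-height: 250px;">
  <!-- the element is injected here after the page has rendered -->
</div>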

For some time now, PageSpeed Insights has recommended using the CSS font-display property, which controls how text is displayed while the web font is loading. Another possibility is to load the font in advance using the preload value in a <link> element. Preload causes the selected resources to be downloaded before the page is rendered. Thanks to this, the problem of stalling and waiting for all the files at the front of the queue is eliminated later in the page display.
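Both techniques are short to apply – a sketch below, with /fonts/my-font.woff2 and the family name as placeholders (note that font preloads require the crossorigin attribute even for same-origin files):

<link rel="preload" href="/fonts/my-font.woff2" as="font" type="font/woff2" crossorigin>

<style>
  @font-face {
    font-family: "MyFont";
    src: url("/fonts/my-font.woff2") format("woff2");
    /* show text immediately in a fallback font and swap in the web font
       when it is ready, instead of leaving the text invisible */
    font-display: swap;
  }
</style>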

Basic Internet Indicators – UX Matters!

If you look at the solutions that have a positive impact on the individual indicators, it is easy to see that they are not new. Many of them were recommended by Google as far back as the first versions of PageSpeed Insights. However, the indicators and their entry into the search engine algorithm are a good reason to take a closer look at your websites and actually make the changes that will make them easier to use.
