SEO analyzer

ibeas.org.br

 IBEAS - Instituto Brasileiro de Estudos Ambientais - HOME


SEO

Title  IBEAS - Instituto Brasileiro de Estudos Ambientais - HOME
Titles are critical to giving users a quick insight into the content of a result and why it’s relevant to their query. It's often the primary piece of information used to decide which result to click on, so it's important to use high-quality titles on your web pages.

Here are a few tips for managing your titles:
  • Make sure every page on your site has a title specified in the <title> tag. If you’ve got a large site and are concerned you may have forgotten a title somewhere, you can also check the HTML suggestions page in Search Console, which lists missing or potentially problematic <title> tags on your site.
  • Page titles should be descriptive and concise. Avoid vague descriptors like "Home" for your home page, or "Profile" for a specific person's profile. Also avoid unnecessarily long or verbose titles, which are likely to get truncated when they show up in the search results.
  • Avoid keyword stuffing. It's sometimes helpful to have a few descriptive terms in the title, but there’s no reason to have the same words or phrases appear multiple times. A title like "Foobar, foo bar, foobars, foo bars" doesn't help the user, and this kind of keyword stuffing can make your results look spammy to Google and to users.
  • Avoid repeated or boilerplate titles. It’s important to have distinct, descriptive titles for each page on your site. Titling every page on a commerce site "Cheap products for sale", for example, makes it impossible for users to distinguish one page from another. Long titles that vary by only a single piece of information ("boilerplate" titles) are also bad; for example, a standardized title like "<band name> - See videos, lyrics, posters, albums, reviews and concerts" contains a lot of uninformative text. One solution is to dynamically update the title to better reflect the actual content of the page: for example, include the words "video", "lyrics", etc., only if that particular page contains video or lyrics. Another option is to just use "<band name>" as a concise title and use the meta description (see below) to describe your site's content.
  • Brand your titles, but concisely. The title of your site’s home page is a reasonable place to include some additional information about your site—for instance, "ExampleSocialSite, a place for people to meet and mingle." But displaying that text in the title of every single page on your site hurts readability and will look particularly repetitive if several pages from your site are returned for the same query. In this case, consider including just your site name at the beginning or end of each page title, separated from the rest of the title with a delimiter such as a hyphen, colon, or pipe, like this:

    <title>ExampleSocialSite: Sign up for a new account.</title>

  • Be careful about disallowing search engines from crawling your pages. Using the robots.txt protocol on your site can stop Google from crawling your pages, but it may not always prevent them from being indexed. For example, Google may index your page if we discover it by following a link from someone else's site. To display it in search results, Google will need to display a title of some kind, and because we won't have access to any of your page content, we will rely on off-page content such as anchor text from other sites. (To truly block a URL from being indexed, you can use the noindex robots meta tag.)
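
For reference, a minimal sketch of the noindex robots meta tag mentioned above; the page and its title are invented for the example, and the directive only works if crawlers are allowed to fetch the page:

<html>
  <head>
    <title>Example page that should stay out of search results</title>
    <!-- Tells compliant crawlers not to index this page. The page must not be
         blocked in robots.txt, or the tag will never be seen. -->
    <meta name="robots" content="noindex">
  </head>
  <body>...</body>
</html>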
Title length 58 characters (Recommended: 35-65 characters)
Description none
The description attribute within the <meta> tag is a good way to provide a concise, human-readable summary of each page’s content. Google will sometimes use the meta description of a page in search results snippets, if we think it gives users a more accurate description than would be possible purely from the on-page content. Accurate meta descriptions can help improve your clickthrough; here are some guidelines for properly using the meta description.
  • Make sure that every page on your site has a meta description. The HTML suggestions page in Search Console lists pages where Google has detected missing or problematic meta descriptions.
  • Differentiate the descriptions for different pages. Identical or similar descriptions on every page of a site aren't helpful when individual pages appear in the web results. In these cases we're less likely to display the boilerplate text. Wherever possible, create descriptions that accurately describe the specific page. Use site-level descriptions on the main home page or other aggregation pages, and use page-level descriptions everywhere else. If you don't have time to create a description for every single page, try to prioritize your content: At the very least, create a description for the critical URLs like your home page and popular pages.
  • Include clearly tagged facts in the description. The meta description doesn't just have to be in sentence format; it's also a great place to include structured data about the page. For example, news or blog postings can list the author, date of publication, or byline information. This can give potential visitors very relevant information that might not be displayed in the snippet otherwise. Similarly, product pages might have the key bits of information—price, age, manufacturer—scattered throughout a page. A good meta description can bring all this data together.
  • Programmatically generate descriptions. For some sites, like news media sources, generating an accurate and unique description for each page is easy: since each article is hand-written, it takes minimal effort to also add a one-sentence description. For larger database-driven sites, like product aggregators, hand-written descriptions can be impossible. In the latter case, however, programmatic generation of the descriptions can be appropriate and is encouraged. Good descriptions are human-readable and diverse, as we talked about in the first point above. The page-specific data we mentioned in the second point is a good candidate for programmatic generation. Keep in mind that meta descriptions comprised of long strings of keywords don't give users a clear idea of the page's content, and are less likely to be displayed in place of a regular snippet.
  • Use quality descriptions. Finally, make sure your descriptions are truly descriptive. Because the meta descriptions aren't displayed in the pages the user sees, it's easy to let this content slide. But high-quality descriptions can be displayed in Google's search results, and can go a long way to improving the quality and quantity of your search traffic.
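
As an illustration of the guidelines above, a sketch of a page-specific meta description with clearly tagged facts; the store, product, and all values are invented for the example:

<head>
  <title>ExampleStore: Acme 500W Solar Panel</title>
  <!-- Concise, page-specific summary with tagged facts (price, manufacturer,
       availability); every value here is a placeholder. -->
  <meta name="description" content="Acme 500W monocrystalline solar panel. Price: $299. Manufacturer: Acme Energy. In stock, ships in 2 days.">
</head>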
Description length 0 characters (Recommended: 70-320 characters)
Keywords none
H1
Count of H1 tags 0
H1 length none (Recommended: 5-70 characters)
H1 equals Title H1 does not equal Title
Count all tags
H2: 1 H3: 0 H4: 0 H5: 0 H6: 0
Content length 1480 characters (Recommended: more than 500 characters)
Content to code ratio 7% (Recommended: more than 10%)

Domain information

Alexa rank 427886
Domain register date 1970-01-01 00:00:00.000000
Registry expire date 1970-01-01 00:00:00.000000

IP information

IP 192.185.222.77
Country United States
IP city Houston
ISP Websitewelcome.com
Organization CyrusOne LLC
Blacklist none

Indexation

<noindex> (Yandex directive) Content in noindex tags not found
URL length 13 characters (Recommended URL length limit: 115 characters)
Protocol redirect HTTP to HTTPS redirect not working
HTTPS (Hypertext Transfer Protocol Secure) is an internet communication protocol that protects the integrity and confidentiality of data between the user's computer and the site. Users expect a secure and private online experience when using a website. Google encourages you to adopt HTTPS in order to protect your users' connection to your website, regardless of the content on the site.

Data sent using HTTPS is secured via Transport Layer Security protocol (TLS), which provides three key layers of protection:
  • Encryption—encrypting the exchanged data to keep it secure from eavesdroppers. That means that while the user is browsing a website, nobody can "listen" to their conversations, track their activities across multiple pages, or steal their information.
  • Data integrity—data cannot be modified or corrupted during transfer, intentionally or otherwise, without being detected.
  • Authentication—proves that your users communicate with the intended website. It protects against man-in-the-middle attacks and builds user trust, which translates into other business benefits.


If you migrate your site from HTTP to HTTPS, Google treats this as a site move with a URL change. This can temporarily affect some of your traffic numbers.
Add the HTTPS property to Search Console; Search Console treats HTTP and HTTPS separately, and data for these properties is not shared between them. So if you have pages in both protocols, you must have a separate Search Console property for each one.
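
Because the report above flags the HTTP to HTTPS redirect as not working, here is one common way to set up a site-wide 301 redirect, sketched under the assumption of an Apache server with mod_rewrite enabled (nginx or other servers use different directives):

# .htaccess at the site root (assumes Apache with mod_rewrite)
RewriteEngine On
# If the request did not arrive over HTTPS...
RewriteCond %{HTTPS} off
# ...redirect permanently (301) to the same host and path over HTTPS.
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]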
404 Page 404 - Correct response
Robots.txt not found
A robots.txt file is a file at the root of your site that indicates those parts of your site you don’t want accessed by search engine crawlers. The file uses the Robots Exclusion Standard, which is a protocol with a small set of commands that can be used to indicate access to your site by section and by specific kinds of web crawlers (such as mobile crawlers vs desktop crawlers).

The simplest robots.txt file uses two keywords, User-agent and Disallow. User-agents are search engine robots (or web crawler software); most user-agents are listed in the Web Robots Database. Disallow is a command for the user-agent that tells it not to access a particular URL. On the other hand, to give Google access to a particular URL that is a child directory in a disallowed parent directory, you can use a third keyword, Allow.

Google uses several user-agents, such as Googlebot for Google Search and Googlebot-Image for Google Image Search. Most Google user-agents follow the rules you set up for Googlebot, but you can override this option and make specific rules for only certain Google user-agents as well.

The syntax for using the keywords is as follows:

User-agent: [the name of the robot the following rule applies to]

Disallow: [the URL path you want to block]

Allow: [the URL path of a subdirectory, within a blocked parent directory, that you want to unblock]

These two lines are together considered a single entry in the file, where the Disallow rule only applies to the user-agent(s) specified above it. You can include as many entries as you want, and multiple Disallow lines can apply to multiple user-agents, all in one entry. You can set the User-agent command to apply to all web crawlers by listing an asterisk (*) as in the example below:

User-agent: *

You must apply the following saving conventions so that Googlebot and other web crawlers can find and identify your robots.txt file:
  • You must save your robots.txt code as a text file,
  • You must place the file in the highest-level directory of your site (or the root of your domain), and
  • The robots.txt file must be named robots.txt

As an example, a robots.txt file saved at the root of example.com, at the URL address http://www.example.com/robots.txt, can be discovered by web crawlers, but a robots.txt file at http://www.example.com/not_root/robots.txt cannot be found by any web crawler.
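
Putting these rules together, a minimal example robots.txt; the blocked paths, the image-crawler rule, and the sitemap URL are hypothetical and would need to match the real site:

# robots.txt saved at http://www.example.com/robots.txt
# All crawlers: block the /private/ directory, but unblock one subdirectory inside it.
User-agent: *
Disallow: /private/
Allow: /private/public-reports/

# A more specific rule that applies only to Google's image crawler.
User-agent: Googlebot-Image
Disallow: /images/drafts/

# Location of the sitemap (see the SiteMap.xml section below).
Sitemap: http://www.example.com/sitemap.xml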
SiteMap.xml ok
A sitemap is a file where you can list the web pages of your site to tell Google and other search engines about the organization of your site content. Search engine web crawlers like Googlebot read this file to more intelligently crawl your site.

Also, your sitemap can provide valuable metadata associated with the pages you list in that sitemap: Metadata is information about a webpage, such as when the page was last updated, how often the page is changed, and the importance of the page relative to other URLs in the site.

You can use a sitemap to provide Google with metadata about specific types of content on your pages, including video and image content. For example, you can give Google information about video and image content:

A sitemap video entry can specify the video running time, category, and age appropriateness rating.
A sitemap image entry can include the image subject matter, type, and license.

Build and submit a sitemap:
  • Decide which pages on your site should be crawled by Google, and determine the canonical version of each page.
  • Decide which sitemap format you want to use. You can create your sitemap manually or choose from a number of third-party tools to generate your sitemap for you.
  • Test your sitemap using the Search Console Sitemaps testing tool.
  • Make your sitemap available to Google by adding it to your robots.txt file and submitting it to Search Console.
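
For reference, a minimal sitemap.xml in the format described above, with one image entry of the kind mentioned earlier; the URLs, date, and priority are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <!-- Canonical URL plus the optional metadata described above (values are placeholders). -->
    <loc>http://www.example.com/</loc>
    <lastmod>2018-01-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
    <!-- Optional image entry; subject matter, type, and license can also be described. -->
    <image:image>
      <image:loc>http://www.example.com/logo.png</image:loc>
    </image:image>
  </url>
</urlset>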

Images

Images without description
Title Alt URL
none none /uploads/5/7/9/6/57960227/1439000581.png
none none /Conresol-logo-horizontal.jpg
none none /Logo%20IX%20CONGEA.jpg
none none /Cursos2018.png
none none /Socio%20Colaborador.jpg
none none /Loja%20Virtual.jpg
none none /Anais%202010%202017.png
none none /Cursos%20em%20Andamento.jpg
none none /Cursos%20Curta%20Duracao.jpg
none none /Sistema%20Ambiente.jpg
none Picture /uploads/5/7/9/6/57960227/banner-credenciamento-enersolar-150x70_orig.png
none Picture /uploads/5/7/9/6/57960227/3702742.jpg
none Picture /Cienciaeclima.jpg
none Picture /uploads/5/7/9/6/57960227/7269362.gif
none Picture /uploads/5/7/9/6/57960227/2155413.gif?84
none Picture /uploads/5/7/9/6/57960227/2911017.jpg?100
none Picture /uploads/5/7/9/6/57960227/9362789.jpg?99
The alt attribute is used to describe the contents of an image file.

It provides Google with useful information about the subject matter of the image. Google uses this information to help determine the best image to return for a user's query. Many people (for example, users with visual impairments, or people using screen readers or who have low-bandwidth connections) may not be able to see images on web pages. Descriptive alt text provides these users with important information.

Not so good:
<img src="puppy.jpg" alt=""/>

Better:
<img src="puppy.jpg" alt="puppy"/>

Best:
<img src="puppy.jpg" alt="Dalmatian puppy playing fetch">

To be avoided:
<img src="puppy.jpg" alt="puppy dog baby dog pup pups puppies doggies pups litter puppies dog retriever labrador wolfhound setter pointer puppy jack russell terrier puppies dog food cheap dogfood puppy food"/>

Filling alt attributes with keywords ("keyword stuffing") results in a negative user experience, and may cause your site to be perceived as spam. Instead, focus on creating useful, information-rich content that uses keywords appropriately and in context.

Links

External Links

Qty Anchors URL
1 //facebook.com/ibeas.ambiental
2 1o. Congresso Sul-Americano de Resíduos Sólidos e Sustentabilidade http://www.ibeas.org.br/conresol1
2 IX Congresso Brasileiro de Gestão Ambiental http://www.ibeas.org.br/congresso9
2 Inscrição turmas 2018 http://www.ibeas.org.br/cega-ufscar.htm
2 Sócio Colaborador http://www.ibeas.org.br/socio.htm
2 Loja Virtual https://www.lojavirtual.ibeas.org.br
2 Anais Congressos http://www.ibeas.org.br/congresso/anais.htm
2 Cursos em Andamento http://www.ibeas.org.br/CursosAndamento.htm
2 Cursos de Curta Duração http://www.ibeas.org.br/CursosCurtaDuracao.htm
2 Sistema Ambiente http://www.ibeas.org.br/Digitalis.htm
1 http://goo.gl/EJI5bA
1 http://www.blobel.com.br
1 http://www.cienciaeclima.com.br
1 Clique aqui e conheça a Filosofia Institucional do IBEAS: Missão, Visão, Princípios e Política. http://www.ibeas.org.br/Filosofia%20Institucional%20do%20IBEAS.htm
1 http://www.ufscar.br
1 http://www.aeaarp.org.br
1 http://www.assenag.org.br
1 http://www.ciespcampinas.org.br

Internal Links

Qty Anchors URL
2 HOME http://ibeas.org.br/
1 SOBRE http://ibeas.org.br/sobre.html
1 CONTATO http://ibeas.org.br/contato2.htm