
Thursday, May 3, 2018




How to Get WordPress Emails Into the Inbox, Not the Spam Folder

My friend told me that he had to look in the Spam folder to find the comment-reply emails from my blog.
The problem is that the From address belongs to the VPS, and its IP can easily be marked as spam because many hosts send large volumes of email every day (imagine all the other users on the same VPS provider). You could set up your own email server, but that involves a lot of technical work.

The fix is an SMTP plugin that routes all WordPress emails through proper SMTP settings, such as WP Mail SMTP: https://wordpress.org/plugins/wp-mail-smtp/
The following is a handy list of settings you will need if you use Gmail. If you let WordPress send emails through your Gmail SMTP account, you will also find all outgoing emails (as a backup) in your Sent folder, which is really nice.
Gmail SMTP port (TLS): 587
Gmail SMTP port (SSL): 465
Gmail SMTP TLS/SSL required: yes
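
If you prefer to wire this up in code rather than through the plugin's settings screen, the snippet below is a minimal sketch using WordPress's own phpmailer_init hook, which exposes the PHPMailer instance core uses for wp_mail(). The Gmail address and app password are placeholders; the plugin route above achieves the same thing without hard-coding credentials.

// A minimal sketch: route all WordPress mail through Gmail SMTP via
// the core phpmailer_init hook. Credentials below are placeholders.
add_action('phpmailer_init', function ($phpmailer) {
    $phpmailer->isSMTP();
    $phpmailer->Host       = 'smtp.gmail.com';
    $phpmailer->Port       = 587;          // TLS port from the list above
    $phpmailer->SMTPSecure = 'tls';        // use 'ssl' with port 465 instead
    $phpmailer->SMTPAuth   = true;
    $phpmailer->Username   = 'you@gmail.com';  // hypothetical address
    $phpmailer->Password   = 'app-password';   // hypothetical app password
});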

Simple as that, problem solved.



How to Fix the WordPress Not Sending Emails Issue

When we open mail.log we often see large numbers of undelivered or deferred messages; sometimes the server's security has been breached and hackers are sending bulk spam from our server. We can track this down from the terminal.
Unauthorized WordPress themes and plugins, server security holes, permission faults, and missing file-upload validation all cause these problems and let hackers inject code into a website.

Find the spam script using mailq

  • Switch to a user with sudo rights
  • Check the mail queue with the command mailq
  • The first column of the mail queue list shows unique mail IDs; pick one from an obvious spam email and copy it
  • Check that email's details with postcat -q <ID>, using the unique mail ID you copied in place of <ID>
  • Identify the line starting with "X-PHP-Originating-Script". This shows which script is generating the spam emails
  • Empty the mail queue with postsuper -d ALL
  • Check the mail queue again with mailq (a scripted version of these steps follows below)
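
If the queue is large, the steps above can be scripted. The following is a minimal PHP sketch, assuming a Postfix server with traditional (hexadecimal) queue IDs, enough privileges to read the queue, and PHP's mail.add_x_header setting enabled (covered later in this article); it prints the X-PHP-Originating-Script header for each queued message so repeat offenders stand out:

<?php
// List queued messages (postqueue -p prints the same output as mailq)
// and show which PHP script generated each one.
exec('postqueue -p', $lines);
foreach ($lines as $line) {
    // Queue IDs are the first token; '*' or '!' marks active/held mail.
    if (preg_match('/^([0-9A-F]{6,})[*!]?\s/', $line, $m)) {
        $detail = shell_exec('postcat -q ' . escapeshellarg($m[1]));
        if (preg_match('/^X-PHP-Originating-Script:.*/m', (string) $detail, $s)) {
            echo $m[1] . '  ' . trim($s[0]) . PHP_EOL;
        }
    }
}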

Remove the spam script using the sed command

'sed' stands for 'stream editor'. Whenever you want to modify text programmatically, sed comes in handy, so we can use it to remove the injected code from the WordPress site. The command below checks every PHP file and strips the injected code.
In our case the injected code starts with <?php $zdwyyyta, its middle part contains $gyeemsq, and it ends with $hrhaior-1; ?>
So open your terminal, change into the infected site's document root (for example domain.com/public_html), and run this command:
find . -type f -name '*.php' -print0 | xargs -0 sed -ri '1 s/.*<\?php \$zdwyyyta.*\$gyeemsq.*\$hrhaior-1; \?>//g'
It searches for that code on the first line of every PHP file and replaces it with nothing, so the injected code is removed from your site's files.
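
Before (or after) running sed, a read-only scan can confirm exactly which files carry the marker. Below is a minimal PHP sketch; $zdwyyyta is the variable name from this particular infection, and yours will differ:

<?php
// Recursively list every .php file whose first line contains the
// injection marker, without modifying anything.
$iter = new RecursiveIteratorIterator(new RecursiveDirectoryIterator('.'));
foreach ($iter as $file) {
    if ($file->isFile() && $file->getExtension() === 'php') {
        $fh = @fopen($file->getPathname(), 'r');
        if ($fh && strpos((string) fgets($fh), '$zdwyyyta') !== false) {
            echo $file->getPathname() . PHP_EOL;
        }
        if ($fh) {
            fclose($fh);
        }
    }
}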
Then check the mail queue again with mailq to see whether more emails are being generated. If the problem persists, repeat the steps above and look for other scripts causing trouble.

The spam script is sending mail again. What next?

This time we want to find the particular mail script inside the WordPress website (or any other PHP files), so we need to find the php.ini file.
If you are using PHP 7, the php.ini location is /etc/php/7.0/apache2/php.ini; it varies by PHP version and Linux distribution, so check your distribution's documentation or visit www.php.net.
  • Open the php.ini file
  • Add the following line: mail.add_x_header = On
  • Also add: mail.log = /var/log/phpmail.log
  • Then create the log file: touch /var/log/phpmail.log
  • Give it permissions: chown httpd:httpd /var/log/phpmail.log (or, less restrictively, chmod 777 /var/log/phpmail.log)
  • Restart your web server, for example sudo service apache2 restart
  • Check the log file: nano /var/log/phpmail.log
Then you can see which script sent the spam mail, for example:
[18-Jan-2018 17:58:35 UTC] mail() on [/home/yourdomain.com/public_html/wp-includes/hack.php:698]: To: mail@domain.com — Headers: Date: Thu, 18 Jan 2018 17:58:35 +0000 From: xxx xxx
Remove that particular file or script and the issue is solved.
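
On a busy site the log fills up quickly. The short sketch below summarizes which files call mail() most often, assuming the log path configured above and the "[date] mail() on [/path/script.php:line]" format shown in the example entry:

<?php
// Count mail() calls per originating script in phpmail.log and print
// the busiest scripts first.
$counts = [];
foreach (@file('/var/log/phpmail.log') ?: [] as $line) {
    if (preg_match('/mail\(\) on \[([^\]:]+)/', $line, $m)) {
        $counts[$m[1]] = ($counts[$m[1]] ?? 0) + 1;
    }
}
arsort($counts);
foreach ($counts as $script => $n) {
    printf("%5d  %s\n", $n, $script);
}

A legitimate contact form produces a handful of entries; a hacked file near the top of this list with hundreds of sends is your spam script.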

PHP mailer script sending spam from WordPress

What happens when you see that class-phpmailer.php sent the spam?
[18-Jan-2018 17:58:35 UTC] mail() on [/home/yourdomain.com/public_html/wp-includes/class-phpmailer.php:698]: To: mail@domain.com — Headers: Date: Thu, 18 Jan 2018 17:58:35 +0000 From: xxx xxx
It does not send the spam by itself; it is probably being triggered by spam bots misusing a vulnerable extension, by leftover backdoor malware scripts, or both.
We at Alts Solution suggest you install Wordfence or the Better WP Security plugin to find the vulnerabilities.

Finally, once you have found the culprit: deleting or uninstalling the plugin or theme might lose client data and the files they have uploaded, so please back up and check carefully.
We believe this article has helped you with information on WordPress security. If you have any doubts about this topic, please leave a comment; the professional techies at Alts Solution are always happy to help. Alts Solution is a top digital marketing and app development company. We offer high-quality services in web design and development, SEO, web hosting, app development, social media marketing, and online reputation management, and we are one of the top online reputation management companies in India.

Sunday, April 29, 2018




Duplicate content can seriously harm your site, so we’ve put together our favorite free duplicate content checkers or plagiarism checking tools for your use.
Choosing to plagiarize content is a risky strategy. Along with losing the respect of their peers, plagiarists have lost their degrees, been fired from their jobs, and scuttled their political careers, not to mention dealing with legal repercussions. So, if plagiarism is regarded as a detestable practice in the offline world, why do people assume that duplicating content in the online world is acceptable? In fact, online duplicate content is a HUGE mistake!

Why You Should Take Advantage Of Duplicate Content Checkers

Search engines want to provide valuable, original content, so they regard plagiarism as a threat to their users’ experience. When a search engine indexes a web page, it scans the page’s content and then compares the content with other indexed websites. If a page is found to have plagiarized content, search engines often will penalize the page by lowering its rankings or removing it entirely from search results. Considering the serious penalties that your site can be landed with if it has plagiarized content, it is highly advisable that you check your existing web content, and any content you plan on publishing, for duplication.

The Best Free Plagiarism Checker Tools For Your Web Content

Even if you are confident your website's content wasn't plagiarized, it's worth checking to make sure nothing was unintentionally duplicated. To help you complete this task (and ensure that your site's rankings stay healthy and unpenalized), here are our four favorite free duplicate content checker tools:
1. DUPLICHECKER
This free plagiarism checker tool lets you run text searches, DocX or text file uploads, and URL searches. It's free with unlimited searches once you register (you're allowed one free search before signing up). Our duplication scan completed in just a few seconds (this will of course depend on the length of the text, page, or file you're scanning). It's simple, free, and effective!
2. SITELINER
For checking entire websites for duplicate content, there is SiteLiner. Simply paste your site’s URL in the box and it will scan for duplicate content, page load time, the number of words per page, internal and external links, and much more. Depending on the size of your site the scan can take a few minutes, but the results are well worth the wait. Once the scan is complete you can click on the results for greater details and even download a report of the scan as a PDF.
Note: The free SiteLiner service is limited to one scan, per site, per month, but the SiteLiner premium service is very affordable (each page scanned only costs 1c, and you can scan as often as you wish).
3. PLAGSPOTTER
The PlagSpotter URL search is free, quick, and thorough. Scanning a web page for duplicate content took just under a minute with 49 sources listed, including links to those sources for further review. There is also an “Originality” feature that allows you to compare text that has been flagged as duplicated. While PlagSpotter’s URL search is free, you can sign up for their no-cost 7-day trial to enjoy a plethora of useful features, including plagiarism monitoring, unlimited searches, batch searches, full site scans, and much more. If you wish to continue using PlagSpotter after the free trial, the paid version is extremely affordable.
4. COPYSCAPE
CopyScape offers a free URL search, with results coming in just a few seconds. While the free version doesn’t do deep searches (breaking down the text in order to search for partial duplication) it does a thorough job of finding exact matches. If you have found two URLs or text blocks that appear similar, Copyscape has a free comparison tool that will highlight duplicate content in the text. While there is a limited number of searches per site with their free service, CopyScape’s Premium (paid) account allows you to have unlimited searches, deep searches, search text excerpts, search full sites, and monthly monitoring of plagiarism.

Notable Copy Checking Mentions:

Update! When we originally wrote this in 2014, there were very few plagiarism or duplicate content checking tools on the market. The list has since expanded dramatically to include many new options and honorable mentions.

Now you know our duplicate content tool recommendations – have any of your own?

We hope that the resources we’ve listed above will help you write quality web content without worrying that your website or blog will be penalized for duplicate content. If you’ve already been using a duplicate content checker for your website or blog, we’d love for you to share your own recommendations or experience in the comments below. If you’d like to learn more about content writing and how it can benefit your site, contact us and we’ll help you devise an effective strategy for your site.


Google's got a blog post out today (and SELand covers it) about how they now recommend that webmasters and site owners DO NOT rewrite their ugly dynamic URLs to be clean and static. What's the reasoning behind this?
We've come across many webmasters who, like our friend, believed that static or static-looking URLs were an advantage for indexing and ranking their sites. This is based on the presumption that search engines have issues with crawling and analyzing URLs that include session IDs or source trackers. However, as a matter of fact, we at Google have made some progress in both areas. While static URLs might have a slight advantage in terms of clickthrough rates because users can easily read the URLs, the decision to use database-driven websites does not imply a significant disadvantage in terms of indexing and ranking. Providing search engines with dynamic URLs should be favored over hiding parameters to make them look static.
The fundamental problem here is that Google is thinking about this from a completely different perspective than marketers. It's not that they're wrong or lying or creating misinformation; it's just that they're looking out for their own best interests: effectively and efficiently crawling the web and serving up accurate data about the contents of pages. When URL rewrites go awry, it can break Google's ability to return the results their users want (and, as a content publisher, the results you want).
However, the fact that some developers incorrectly create rewrite rules does not mean that sticking with dynamic parameters is now the "best practice." It simply means you have to do it right.
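For illustration, "doing it right" typically means one catch-all rewrite that hands every request to a front controller, keeping the URL-to-parameter mapping in a single place. Below is a minimal PHP sketch; the path pattern and product.php script are hypothetical:

<?php
// Front controller: the web server rewrites all requests to this file,
// and we map the clean, static-looking path back to the dynamic
// parameters it replaces (/products/blue-widget -> product.php?id=...).
$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

if (preg_match('#^/products/([a-z0-9-]+)$#', $path, $m)) {
    $_GET['id'] = $m[1]; // the same data the ugly dynamic URL carried
    require __DIR__ . '/product.php';
    exit;
}

http_response_code(404);
echo 'Not found';

Because the mapping lives in one file, there is exactly one place to test when URLs misbehave, which addresses the "rewrites go awry" concern directly.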
Let's go over the list of pros and cons for static vs. dynamic URLs and see what's really changed:
Pros of Dynamic URLs
  • Umm... they're usually longer?
  • Google (1 of the 4 major search engines) says they can effectively crawl and index them
Cons of Dynamic URLs
  • Lower click-through rate in the search results, in emails, and on forums/blogs where they're cut and pasted
  • A greater chance of cutting off the end of the URL, resulting in a 404 or other error when copying/pasting
  • Lower keyword relevance and keyword prominence
  • Nearly impossible to write down manually and share on a business card or read over the phone to a person
  • Challenging (if not impossible) to manually remember
  • Does not typically create an accurate expectation of what the user will see prior to reaching the page
  • Not usable in branding or print campaigns
  • Won't typically carry optimized anchor text when used as the link text (which happens frequently due to copying & pasting)
Pros of Static URLs (mostly the opposites of the above)
  • Higher click-through rates in the SERPs, emails, web pages, etc.
  • Higher keyword prominence and relevancy
  • Easier to copy, paste and share on or offline
  • Easy to remember and thus, usable in branding and offline media
  • Creates an accurate expectation from users of what they're about to see on the page
  • Can be made to contain good anchor text to help the page rank higher when linked-to directly in URL format
  • All 4 of the major search engines (and plenty of minor engines) generally handle static URLs more easily than dynamic ones, particularly if there are multiple parameters
Cons of Static URLs
  • You might mess up the rewriting process, in which case your users and search engines will struggle to find content properly on your site.
So - bottom line - dynamic URLs don't afford you the same opportunity for search engine rankings, usability or portability that rewritten, keyword-optimized URLs do. Just because one of the engines doesn't have trouble crawling them doesn't mean it's any less critical to continue optimizing this element of a site's structure.
If you buy into Google's argument that you shouldn't rewrite URLs because the process can occasionally cause problems (never mind that we've done it at SEOmoz and with our clients dozens of times without issues), you're setting yourself up for something significantly less than search engine "optimization." I'd be tempted to call it conservative SEO, but it's not really even that. It's the mindset that living in fear of change rather than pursuing the best course of action is the better choice, and none of us who do SEO for a living should support that mentality.


There are plenty of websites on the internet today, and we cannot immediately tell whether a website is static or dynamic. But some of the sites that look simple, serving small businesses without bells and whistles, are actually static websites. Static websites can only be updated manually by someone with development knowledge. They are the cheapest to develop and host, which is why many smaller companies still use static websites for their web presence.
Let us take a look at Pros and Cons of Static Website:
Pros:
  1. Easy and Quick to Develop
  2. Cost Effective Development
  3. Cost Effective Hosting
Cons:
  1. Need to have strong web development knowledge to update the website
  2. Not very useful for users
  3. Content can get stagnant

Dynamic websites cost more than static websites, but they give the website owner the ability to update the site with ease. Dynamic websites are built on a CMS such as WordPress, Joomla, and others that are very easy to access and understand. One can easily upload content and documents to the site when needed.
Pros:
  1. Functional Websites
  2. Easier to Update
  3. Content freshness attracts visitors towards the website
  4. Team and users can collaborate and work together
Cons:
  1. Expensive To Develop
  2. Hosting Costs Little More
As the advantages of dynamic sites get more of the limelight, more and more people are opting for dynamic websites. A dynamic website lets you make the most of your site, whether you use it as a tool or to create a professional, interesting experience for your visitors. I hope you have found this blog post useful. Comment your suggestions, if any; your suggestions are always welcome.




You may have heard the words static and dynamic thrown around when people talk about websites, but still not really know what they mean, or even how they're different. By the end of this article you should have a firm grasp of the biggest differences between the two and, hopefully, feel confident explaining them to someone else. It all goes back to the idea that there are websites and web applications. A web application is a website, but a lot of websites can't be web applications. For example, Facebook is a website, but it's also a web application. A hometown cupcake place's simple site, like Cupcakes To Go Go, is a website but not a web app. You'll often hear static sites called websites and dynamic sites called web apps.

Static

A static site is the most basic kind of website, and the easiest to create. It requires no server-side processing, only client-side. Client-side technologies are HTML, CSS, and JavaScript. In simpler terms, it requires no use of the back end. A static website is delivered to the user exactly the way it's stored. That means that nothing on the page will be changed by the user or even the site administrator, unless there's a redesign of the site or the administrator goes directly into the code to change it. Nothing is stored but the actual pages of the site: there are no users, no comments, no blog posts, and no interactivity. No programming languages are required to make a static site. Technically JavaScript is a programming language, but it's not required, and a site that uses JavaScript but no PHP or other server-side language is still considered static, since JavaScript is a client-side language.
Image: fanned-out brochures (Creative Commons image by Flickr user Antonio Bonanno)

Static web pages are made of "fixed code", and unless the site developer makes changes, nothing on the page will change. Think of it like a brochure for a business. The brochure can't just change itself; to change it, someone has to create a new one. That's why static websites are sometimes called brochure sites: they give you a lot of the same information you could get from a brochure.
So, if you need to make a site that needs to just get information out there and not be updated regularly, creating a static site would be much simpler and probably effective for you.

Dynamic

There's a simple way to determine if a site is dynamic: if you can interact with it, it's a dynamic site. So most of the sites you probably visit are dynamic sites, be it Reddit, Twitter, Facebook, or even Digital-Tutors. Interacting with the site means more than just clicking a link within it: think commenting on a post, creating a user profile, or making a reservation.
Examples of dynamic sites are blogs, e-commerce sites, calendar or to-do sites, or any site that needs to be updated often.
Dynamic sites use languages like PHP to interact with information stored in a database. For this reason, dynamic sites are much more complicated and expensive to create: not only is web hosting required, but databases or servers must be set up as well. The languages used to create dynamic sites are also much more complicated than the client-side languages.

Image: servers at CERN (Creative Commons image by Flickr user Torkild Retvedt)

Most dynamic sites utilize a Content Management System to, you guessed it, manage their content. Often, developers will create a custom CMS for their clients (using PHP and MySQL), but that's not necessary. There are tons of free systems available for your use, like WordPress, Drupal, and Joomla.
You might hear that PHP and ASP.NET are used to generate HTML dynamically. That really just means that those programming languages can, with direction, change and write HTML without a person having to actually go into the code and change it.
Another term often associated with dynamic sites is CRUD, which stands for Create, Read, Update and Delete. All four of those things happen when you're working with a dynamic site, because they refer to the functionality of a database. Think about a blog. In any successful blogging platform, you have to be able to create content, then read or view that content on the page. You should be able to update or edit your posts, and then have the option to delete them as well. All that work happens in the database, and content management systems make the process possible.
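To make CRUD concrete, here is a minimal PHP sketch of the four operations behind a blog, using PDO with a hypothetical SQLite database and posts table (assumes the pdo_sqlite extension is available):

<?php
// Create the table once, then exercise Create, Read, Update, Delete.
$db = new PDO('sqlite:blog.db');
$db->exec('CREATE TABLE IF NOT EXISTS posts (id INTEGER PRIMARY KEY, title TEXT, body TEXT)');

// Create
$db->prepare('INSERT INTO posts (title, body) VALUES (?, ?)')
   ->execute(['Hello', 'My first post']);

// Read
foreach ($db->query('SELECT id, title FROM posts') as $row) {
    echo $row['id'] . ': ' . $row['title'] . PHP_EOL;
}

// Update
$db->prepare('UPDATE posts SET body = ? WHERE id = ?')
   ->execute(['Edited body', 1]);

// Delete
$db->prepare('DELETE FROM posts WHERE id = ?')->execute([1]);

A CMS wraps exactly these statements in forms and buttons: "publish", "edit", and "trash" in a blogging platform are Create, Update, and Delete on its posts table.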

Conclusion

As you begin your web design journey, you may find yourself creating a lot of static sites first, and that's OK! It's better to completely understand the client side before you jump into the server side, anyway.



The short answer is: not really. You need to check case by case, and even then you cannot be 100% sure whether a website is dynamic or static. Nowadays websites also make heavy use of jQuery and Ajax.
Static vs. Dynamic:
Static websites: the site consists of fixed pages and formats, and displays exactly the same information whenever anyone visits it.
Dynamic websites: the site can change the page contents dynamically based on the client's browser or request.
You can also use a browser extension or add-on to check whether a site is static or dynamic:
Wappalyzer:
I use Wappalyzer in my browser to learn what sites are built with. Wappalyzer is a browser extension that uncovers the technologies used on websites; it detects content management systems, eCommerce platforms, and more. This is the easiest way I have found to identify whether a site is static or dynamic.
Firefox add-on: Wappalyzer
Chrome extension: Wappalyzer
All websites are dynamic; it's just a question of when. Some change several times a day, others less often, maybe once or twice a year. Sooner or later they'll most likely all change to 'just a memory'.
They are also all static: even the most dynamic site doesn't change at every request, or else reverse proxies and caches wouldn't make much sense.
So look for the marks of a CMS or some kind of rendered page.
I used to write sites using a lot of SSI (Server Side Includes). My HTML files were made up of a header file, a footer file, and some content from a program, or else an index file using raw HTML. Most of the pages didn't exist until someone requested them; that's about as dynamic as you can get.
I don't do anything outside of a CMS now. I specialize in WordPress.

There's a browser extension called BuiltWith that will show you a lot about the final server and the technology used to present a web page. Things like reverse proxy boxes and cloud-based networking can obscure some of this; my sites all run on Apache but tell the world they're on Nginx, since CloudFlare's reverse proxy runs out front.
You'll learn a lot just browsing around with BuiltWith helping you examine the rendering site.
You can also hit Ctrl-U to view the underlying source code. It will still be obscured, but as you look at different sites you'll start recognizing CMS systems and such.
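
Response headers are another quick tell. Below is a minimal PHP sketch that fetches a page's headers and prints common technology fingerprints; the URL is a placeholder, and header-name casing varies by server, so the keys are lower-cased before matching:

<?php
// Fetch headers and print the ones that hint at the server stack.
// A wordpress_* cookie or an X-Powered-By: PHP header usually means a
// dynamically rendered page; a purely static host mostly sends Server
// and caching headers.
$headers = get_headers('https://example.com', true);
if ($headers !== false) {
    $headers = array_change_key_case($headers, CASE_LOWER);
    foreach (['server', 'x-powered-by', 'set-cookie'] as $name) {
        if (isset($headers[$name])) {
            $value = $headers[$name];
            echo $name . ': ' . (is_array($value) ? implode(' | ', $value) : $value) . PHP_EOL;
        }
    }
}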


Seeing the "parallelize downloads across hostnames" warning in Pingdom, GTmetrix, or Google PageSpeed Insights? It appears because web browsers are limited in the number of concurrent connections they can make to a single host. This is mainly due to HTTP/1.1, under which browsers open on average six connections per hostname. The warning is typically seen on websites with a large number of requests. In the past, the only way around this limitation was to implement what is called domain sharding.






Note: If you are running over HTTPS with a provider that supports HTTP/2, this warning can usually be safely ignored now. With HTTP/2 multiple resources can now be loaded in parallel over a single connection.
Depending upon the tool or software reporting it, the warning might appear in a couple different ways:
  • “parallelize downloads across hostnames”
  • “increase download parallelization by distributing these requests across multiple hostnames”
If you are still running over HTTP and haven’t migrated to HTTP/2 yet, you can follow the tutorial below on how to implement domain sharding. Again, most of the techniques are now considered deprecated. Over 77% of browsers now support HTTP/2 when running over HTTPS, as well as many CDN and web hosting providers, including Kinsta. It is also important to note that Pingdom doesn’t support HTTP/2 yet since it uses an older version of Chrome.

Fix “Parallelize Downloads Across Hostnames” Warning

Domain sharding refers to spreading your assets out across multiple subdomains. By doing this you can multiply the number of simultaneous requests. Using domain sharding also gives you the ability to load content from cookie-free subdomains. However, there are a couple of drawbacks: by introducing additional subdomains you add more DNS lookups, which increases resolution times, and you lose a lot of your caching benefits. Follow the steps below to set it up.

1. Setup Additional Subdomains

The first thing you will need to do is create additional subdomains and/or CNAME records across which to spread the requests for your static assets. You can do this at your DNS registrar, or if you are a Kinsta customer you can also edit your DNS records from within your My Kinsta dashboard. Typically no more than four are recommended. You will want to point your additional CNAMEs at your /wp-content directory. An example configuration might be:

domain.com
static1.domain.com
static2.domain.com

2. Edit WordPress Config

You then have to configure WordPress to parallelize the downloads across those subdomains. Simply add the following code to your WordPress theme's functions.php file (source: GitHub), and replace the $subdomains values with your own subdomains. All subdomains/hostnames MUST have the same structure/path.

function parallelize_hostnames($url, $id) {
    // Swap the site's hostname for one of the shard hostnames.
    $hostname = par_get_hostname($url); // call supplemental function
    $url = str_replace(parse_url(get_bloginfo('url'), PHP_URL_HOST), $hostname, $url);
    return $url;
}

function par_get_hostname($name) {
    // Add your subdomains here, as many as you want. Hashing the file
    // name means a given asset always maps to the same shard, which
    // preserves browser caching.
    $subdomains = array('media1.mydomain.com', 'media2.mydomain.com');
    $host = abs(crc32(basename($name)) % count($subdomains));
    return $subdomains[$host];
}

add_filter('wp_get_attachment_url', 'parallelize_hostnames', 10, 2);

This same technique can also be used with CDN providers such as KeyCDN, MaxCDN, and CloudFlare to fix the "parallelize downloads across hostnames" warning. However, almost all CDN providers now support HTTP/2, with which domain sharding is not recommended. And you can still serve assets from a CDN over HTTPS even if you haven't yet migrated your WordPress site to HTTPS.
