Friday, December 5, 2014

Reduce outgoing traffic by 90%


I am moving from my VPS to an instance on Amazon EC2. The thing is: while choosing my AMI I saw a custom AMI that promised to reduce outgoing traffic by 90%, but I lost the link. Could it be a scam?


Does anyone here know about it? I would also like some hints on reducing traffic in general.


I really need to change, because Amazon charges for outgoing traffic (even a new customer eligible for the free tier gets just 15 GB of free outgoing traffic...).


My current VPS:


1 core @ 1 GHz, 1,300 MB RAM, 60 GB disk space, 1 TB outgoing traffic


And I pay $58.00 for it, which is not cheap.
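
For what it's worth, one of the simplest ways to cut outgoing bytes is compressing text responses before they leave the server; a minimal .htaccess sketch, assuming Apache with mod_deflate available (the MIME type list is illustrative):

<IfModule mod_deflate.c>
    # Compress common text responses; images and video are already compressed
    AddOutputFilterByType DEFLATE text/html text/css text/plain application/javascript
</IfModule>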





Creating global FTP access with a local ADSL IP [on hold]


I'm wondering if there is a way I can set up my own FTP server with global access. I have a local FTP server which I want to make accessible from outside my network. What steps should I follow? I don't have a static IP, so should I buy one, or are there other solutions? Many thanks.





Hosting solutions for image uploads. [duplicate]







I'm developing an application which will primarily deal in image uploads. Since the RESTful API is written in PHP, I've been looking at options similar to fortrabbit.com for hosting the app itself. Most hosting solutions which provide SSH, git, and composer support do not include much actual file space. Since I haven't dealt with much web administration, I'm not sure how this is usually solved (besides, of course, large companies which run their own servers). Would I use a separate solution such as Rackspace for the actual image hosting? If so, what would the speed look like writing this data over cURL etc.? Sorry for my noobishness, this is far from my area of expertise. :D
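
As a point of reference for the cURL part, handing an uploaded image off to a separate storage service is typically a single multipart POST; a minimal PHP sketch, where the endpoint, header, and file path are placeholders rather than any real provider's API:

<?php
// Hypothetical storage endpoint and API key; replace with the real ones.
$ch = curl_init('https://storage.example.com/upload');
curl_setopt_array($ch, array(
    CURLOPT_POST           => true,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HTTPHEADER     => array('X-Api-Key: YOUR_KEY'),
    // CURLFile attaches a file to a multipart POST (PHP >= 5.5)
    CURLOPT_POSTFIELDS     => array('file' => new CURLFile('/tmp/photo.jpg', 'image/jpeg')),
));
$response = curl_exec($ch);
curl_close($ch);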





Will a Cloned Site Get Indexed Without Links to Inner Pages


I've cloned a site to a new domain just to use the functionality of the site. Everything will be different on the new domain, but there are a lot of leftover files, and you can access URLs that are identical to the original domain's.


Can these URLs, on the new domain, get indexed in Google and cause duplicate content problems? Even if they cannot get indexed, is it best practice to delete them?





pagePath and nextPagePath in the Analytics API not working


I am trying to get a list of the pages people go to after visiting a selected page. pagePath is the starting page; nextPagePath is the resulting list of pages I'm interested in. I included a filter to show only one starting page, but the results I get are confusing:


(screenshot: Google Query Explorer results for pagePath/nextPagePath)


What am I doing wrong?!





Compare values for a category in Google Analytics event tracking


I have event tracking set up on our Where To Buy form, where users enter the zip code they want to search for dealers.


The event value is the zip code entered.


What I would like to see is a graph of the entered zip codes, but so far I have only been able to see the overall performance of the event category, not the data I was hoping for.


Is there a way to make the event report display a graph of the values for an event category? Or do I need to change this event tracking to make the zip code the category?





What is the most suitable instant payment system for my website? [on hold]


I am currently working on a website and I want to integrate a payment system. As you can imagine, there are a lot of similar services (PayPal, Stripe, Google Wallet, Dwolla, etc.) and I have really no experience using them. So I would appreciate any recommendation that will satisfy my needs:



  1. Customers will be able to pay by credit card, without needing to create an account.

  2. Money will be instantly transferred to the website's account after payment for a service (I don't want to wait multiple days for a transfer, like between banks).

  3. When a service is cancelled or the customer's needs are not satisfied, the money will be returned to the customer.

  4. If the service is completed, the money will be split between the bank account of the service provider and the bank account of the website.


So basically I want the customer to first reserve the money (somehow) for the service he wants, and only after the service is done is the money actually paid out.


Do you have any ideas please?





Please help with this code. Emergency!


It does not work. Can you find what I did wrong? Please help!


{phrase var='core.search'} {if Phpfox::isUser()}

  • {$aUserSideMenu.full_name|shorten:15:'...'}


View Profile


{/if}


{if Phpfox::isUser()} 0{phrase var='data.mobile_messages'}

Mail From Fans


{if Phpfox::getParam('data.follow')} {phrase var='data.following'}({$aUserfollowing})

Stars You Follow


{phrase var='data.followers'}({$aUserfollowers})

Your Fans


{phrase var='data.favorites'}({$aUserfavorites})

Stuff You Like


Media Photos

Stuff You Like


Videos

Stuff You Like


Music

Stuff You Like


{*if Phpfox::isUser()} {module name='data.trending_topics_mobile'} {/if*} Options {if Phpfox::isUser()} {phrase var='data.mobile_settings'}

Change Is A Good Thing


{phrase var='data.mobile_privacy'}

Shhhhh..


{phrase var='data.mobile_terms'}

But It's The Terms Tho


{if Phpfox::isUser()} {phrase var='data.mobile_sign_out'}

Come Back Soon!


{/if


{/if}


{/if}





Identify the correct XAMPP installation from the XAMPP landing page alone


I am on company premises and we have a shared LAN.

My system's IP is 192.168.0.x.

My colleague's system IP is 192.168.0.y.

Now, 192.168.0.x/xampp and 192.168.0.y/xampp both open with the identical regular orange XAMPP landing page.


So my question is:

Is there any way to tell, from the XAMPP landing page alone, that this is my very own XAMPP?

Are there differences in the headers if the systems differ, like "XAMPP for Windows" vs. "XAMPP for Linux"? The rest of the page is identical.

And securing every XAMPP installation is not the solution.

We could customize the landing page's color and design, but opening 192.168.0.x over the LAN would still show the same page to everyone.

Can the latest version of XAMPP do this programmatically?
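
One low-tech idea (a sketch, not a built-in XAMPP feature): drop a tiny PHP file into each installation's htdocs so the server identifies itself; the file name is arbitrary.

<?php
// e.g. htdocs/whoami.php -- prints which machine served the page.
// gethostname() requires PHP >= 5.3.
echo 'This XAMPP runs on host: ' . gethostname() . ' (' . PHP_OS . ')';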





Google showing 2 different descriptions for same domain


When searching on Google for 2 keywords from my site, for example "lorem ipsum" and "loremipsum", I get 2 different descriptions on Google.

"loremipsum" shows the good description, which is also in the meta description, while "lorem ipsum" shows some description from an article on that homepage.


Do you have any idea what's going on?





Browser plugin similar to Google Tag Assistant, but customizable?


Our data scientists push events to our database using custom events, in parallel to Google Analytics.

Are there any plugins similar to Google Tag Assistant that are customizable and let me choose which event types I want to see?





What kind of license we need to offer song downloads on a website


I have seen many, many websites which let you download the latest songs for free. I doubt they have any kind of license or partnership with any music company, so if they do not have any such thing, how come they are running websites with thousands of pages indexed on Google? If they do have some license or partnership, what is it? And how do they get all the songs on their website from the very first day of release? I am running a website where people can listen to songs online, but I am using the SoundCloud API for that.

I am asking this question here because, for reasons I don't know, I cannot ask questions on stackoverflow.com, and this question doesn't seem suitable for the music community of the Stack Exchange network.





Thursday, December 4, 2014

Is there a way to use formulas as events in Mixpanel funnels?


Whenever my app is opened, I send either an "account retrieved" or "account created" event. I could additionally send an "app opened" event, but that feels unnecessary since the account created/retrieved events already capture it. I can create an "app opened" formula, but then I can't see a way of using that formula in funnels.


Is there a solution or workaround that will achieve this?





Can we use regex in a robots.txt file to block URLs?


I have a few dynamically generated URLs. Can I use regex to block these URLs in my robots.txt file?
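
For reference, robots.txt has no full regex support, but the major crawlers (Googlebot, Bingbot) honor the * wildcard and the $ end-of-URL anchor; a sketch with illustrative paths:

User-agent: *
# Block any URL containing a session parameter anywhere in it
Disallow: /*sessionid=
# Block URLs ending in .php ($ anchors the end of the URL)
Disallow: /*.php$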





(SEO) Should I use WordPress as my website, contain WordPress in my website, or neither?


I'm getting ready to release some software; however, I was thinking that I could create a dev-blog for the updates that I make, etc. The question comes down to three different solutions:



  • Write my own blogging software; it would be much more lightweight.

  • Use WordPress and make my entire website revolve around it. (The site is currently fully integrated with a forum software, which I intend to keep.)

  • Use WordPress, but keep the original website and just install WP in a subdirectory.


I'm personally leaning towards solution number one, because I can make a very clean and optimized blogging platform that's suited 100% to my needs. I won't have to worry about logging in, or spam comments, or anything like that.


However, I'm curious as to what would be the best approach for SEO. I'm just using standard HTML/CSS(3) for my website currently, my design skills aren't the best, but I can still do all of the back-end just fine.


What do you guys think? I'd also implement my own RSS feed.


Purely a question for SEO purposes.





Is it possible to create both FTP and GitHub deployment on one website in Microsoft Azure


I have a website hosted on Windows Azure and it is initially linked to GitHub deployment. My question is, is it still possible to create an FTP deployment even if I have my site linked to GitHub already? I just want to access my site through FTP. Is this possible? If yes, how can I create one?





Structured Data Helper - Pre-render SEO


We've got a page, developed using AngularJS, that uses Prerender as a service in order to serve static HTML to bots. This is using HTML5 mode in Angular, so the URLs look like standard URLs. Essentially, how it works is: if a request comes in with an _escaped_fragment_ argument, we serve a static HTML file.


Within this static HTML is all the structured data markup. We're trying to test this with the structured data helper. If I request a page with http://ift.tt/1vSMuiF, the structured data helper doesn't pick up anything. If I request it with http://ift.tt/1zX15c2, it picks up all my structured data.


However, how can I confirm that this is the way Google will request my page? I was thinking the structured data helper would request the page the same way Google would (i.e. append the escaped_fragment bit transparently), but that does not seem to be the case.





Google Open Sans font does not display correct weight in Google Chrome


I use the Google Font 'Open Sans' on my website.


I noticed the incorrect font weight was being displayed for certain links. My regular text has a font-weight of 300, while my links have a font-weight of 400. However, all the links look the same as the regular text.


I navigated to the Google Fonts website:


http://ift.tt/1dqsWsf


and noticed something interesting.



  • On Windows, on Google Chrome, the Light 300 and Normal 400 versions are identical.

  • On Windows, on Firefox, IE, and Safari they are not.

  • On Ubuntu, on Google Chrome, they are not.


The issue seems to be isolated to Google Chrome (latest version) and Windows (version 8.0).


My fonts are loaded onto my website using a link in the header.



<link href='http://ift.tt/OKKK6E' rel='stylesheet' type='text/css'>


However, the issue is not isolated to my website, as the Google Fonts site itself shows the same issue.


I wanted to ask for someone's help with this issue. I have read online about other issues with Google Fonts, such as smoothness, but nothing on font-weight.





Content Grouping Not Showing in Google Analytics


I am working on a Google Analytics project and am trying to implement Content Grouping.


I go into the 'Admin' panel and look under 'View', between 'Goals' and 'Filters', but 'Content Grouping' is not there.


I should have access to this with even the lowest permissions granted from my client, correct? Is there somewhere else I can find the Content Grouping settings?


Thanks for the help





Linking Domain To Facebook [on hold]


I am currently trying to build a website; my domain name is FabulousEarnings.com. How do I also link the domain to Facebook? I have tried getting help through GoDaddy with no success.





Analytics reporting all subdomains but not the main domain


My main domain is, say, example.com. I have a dozen subdomains, and I have created Views for each subdomain with this filter: "Include only > traffic to the hostname > that contain"


So for the subdomain - abc.example.com - the filter will be:-


Include only > traffic to the hostname > that contain = abc.example.com


The reporting is okay for the subdomains, but for the main domain the reporting is always blank, no matter how many people visit it. I know the tracking code is in place because even tools.pingdom.com shows the JS code loading.


These are the filters I have used for the main domain:




  1. Include only > traffic to the hostname > that begins with = example.com




  2. Include only > traffic to the hostname > that begins with = www.example.com




What am I doing wrong?
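
For comparison, note that a hostname of www.example.com does not begin with "example.com", so a single regex-style hostname filter is often used instead for the bare domain plus www; a sketch of such a pattern (assuming the "matches regex" filter type):

^(www\.)?example\.com$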





Legal to sell in the UK, but not in the USA. How does the location of my website play in to this?


I have a company in the UK which sells a product that is "Legal" here in the UK.


But it's considered a medical device in the USA, and to my understanding it cannot be sold "within" the USA...


To my understanding I can ship the product to the USA; I have done so for years, and my partner has done so for over a decade. I believe this is because the "sale" of the item is illegal in the USA unless you're a licensed doctor; however, it's not illegal to own or possess the item.


The customer is buying from my UK company, and the money is being received by a UK bank/merchant processor, etc...


Currently the website is hosted on a UK server, but I'd like to switch to a USA server since 95% of my customers are in the USA, and I want to speed it all up...


I'm trying to understand if I could be breaking some laws. If the website is "hosted" in the USA, does this mean the "sale" is taking place within the USA?





Can you create rich websites (more than just text) using Google Sites?


After trying out Google Sites to create a website, I feel it is very limited to mainly text features. I have tried to find a way to implement elements directly using HTML, but to no avail. Buttons are completely blocked, which is a great disadvantage.


I have tried jQuery, but all script is blocked.


Is there any way to implement elements like <img> using Google Sites HTML?





Starting with Google Calendar in PHP


I have to synchronize a website on which you create events with the company's Google Calendar, so everyone can see them on their phone. My problem is that I'm new to web development and to PHP (I'm a C person), and I have to program it in PHP. I have been searching for a guide but can't find one (the one from Google is for v1 with the Zend Framework), and every other guide I found doesn't explain everything from the start, so I have a bunch of random code snippets that don't work with the first steps.


http://ift.tt/1995tWM


I'm looking for a template of that function (a complete template, with the imports) to work from. Does anyone have/know one?


Thank you!
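
For what it's worth, a rough sketch of inserting an event with the Google APIs PHP client (1.x-era class names; the autoload path, credential file, token handling, and times are placeholders, so treat this as a starting point rather than a verified template):

<?php
require_once 'google-api-php-client/src/Google/autoload.php'; // path varies by client version

$client = new Google_Client();
$client->setApplicationName('Company Calendar Sync');
$client->setAuthConfigFile('client_secret.json'); // OAuth credentials from the Developers Console
$client->setAccessToken($storedToken);            // token obtained earlier via the OAuth flow

$service = new Google_Service_Calendar($client);

// Build the event and insert it into the primary calendar.
$event = new Google_Service_Calendar_Event(array(
    'summary' => 'Team meeting',
    'start'   => array('dateTime' => '2014-12-10T10:00:00+01:00'),
    'end'     => array('dateTime' => '2014-12-10T11:00:00+01:00'),
));
$service->events->insert('primary', $event);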





How do I build a video streaming website with my own server?


I want to build a website with video streaming features (like YouTube) on which I would upload a limited number (approx. 90) of lecture videos of 45 minutes each. But as soon as I started searching online for how to accomplish this, I found nothing to guide me on how to build one. Yes, I know it's unbelievable that I can't find it on the internet, but it's true: everyone was suggesting some site that would make it for me, or some other method. Someone in a comments section gave the honest opinion that no commercial server will do this properly without cutting corners, so I decided I will make my own server from my old laptop (if that's possible). So my final confusion comes down to this: how can I design a web page where I can place videos such that it's not a YouTube player (or any other player) which directs viewers to their site? I want an independent website where I am the master (after buying a domain name). Please tell me the process to build such a website in simple words (I am not a programmer, and yes, I will google the things you tell me in order to understand the technical terms ;) ).
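
For the player part specifically, a self-hosted page needs nothing more than the HTML5 video element, which carries no third-party branding or links; a minimal sketch (the file name is a placeholder):

<video width="640" height="360" controls preload="metadata">
  <!-- The browser's built-in player; nothing points back to another site -->
  <source src="lectures/lecture-01.mp4" type="video/mp4">
  Your browser does not support HTML5 video.
</video>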





Is it expected behaviour that Google also crawls #! urls?


I'm a developer for an AJAX webapp and, following Google's specification for crawlable webapps, we use #! to indicate that it's a AJAX application such that we can serve a static page to Google instead. This all works perfectly fine: Google fetches the ?_escaped_fragment_ URLs instead.


However, in the logs we found that even though we follow this specification Google also fetches the original AJAX pages, and in the process it generates script errors.


Is it expected behaviour that Google visits the AJAX URLs, even though it knows we have specially prepared pages for it? I can imagine that Google does this to train its AJAX crawler, but I cannot find any information about it.


Additionally, does this have any influence on our ranking?





Random domain pointing at my dedicated IP


A random domain name has its A record pointed at my dedicated IP, which duplicates my content. Is there a way I can disable the domain at the IP level, maybe in iptables?


I have .htaccess set up to redirect to the main domain, which always forces HTTPS.
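
Besides iptables, this can also be handled inside Apache by refusing requests whose Host header is not one of your own domains; a minimal mod_rewrite sketch for .htaccess (the domain is a placeholder):

RewriteEngine On
# Return 403 Forbidden for any Host other than (www.)example.com
RewriteCond %{HTTP_HOST} !^(www\.)?example\.com$ [NC]
RewriteRule ^ - [F]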





Can a website ONLY have a generic top level domain


With the introduction of generic top-level domains that groups can apply for, will it be possible for a website's host to consist of only a top-level domain? For example, would it be possible for an example company to use http://example/, after registering "example" as their TLD, instead of http://example.com/?





Wednesday, December 3, 2014

What is the difference between microdata and microformats?


I have searched quite a bit for this and I am not happy enough with what I have read. I also found these questions here:


Microdata vs. Microformats


What are the advantages of Schema.org's microdata vs Microformats et al?


I have a news/social network site in Spanish and want to know which would be the best format to implement.


Currently, I have both running. But I want to know if it is better to have only one; if so, which one would best fit my site? Or can I keep both of them?


I want to target most search engines, not only Google.
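
To make the comparison concrete, here is the same content marked up both ways (a minimal sketch; the name is illustrative). Microformats reuse class attributes with agreed-upon names, while microdata adds dedicated itemscope/itemtype/itemprop attributes tied to a vocabulary such as schema.org:

<!-- Microformats (hCard): semantics carried by class names -->
<div class="vcard">
  <span class="fn">Ana García</span>
</div>

<!-- Microdata (schema.org): semantics carried by item* attributes -->
<div itemscope itemtype="http://schema.org/Person">
  <span itemprop="name">Ana García</span>
</div>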





VPS resourcers warning dcachesize/oomguarpages


I am running 1 vBulletin forum, 1 WordPress site, 2 OpenCart (e-commerce) stores, and 3 simple HTML sites.

And sometimes I get warnings like:



Dec 04, 2014 12:11:29 AM Resource Resource dcachesize green alert on environment 113 example.com current value: 22256812 soft limit: 26210000 hard limit: 26210000 Green zone dcachesize

Dec 04, 2014 12:18:29 AM Resource Resource dcachesize yellow alert on environment 113 example.com current value: 22499725 soft limit: 26210000 hard limit: 26210000 Yellow zone dcachesize

Dec 04, 2014 12:11:29 AM Resource Resource oomguarpages yellow alert on environment 113 genesiseries.com current value: 168249 soft limit: 191146 hard limit: 191146 Yellow zone oomguarpages

Dec 04, 2014 12:13:30 AM Resource Resource oomguarpages green alert on environment 113 genesiseries.com current value: 116764 soft limit: 191146 hard limit: 191146 Green zone oomguarpages


My CPU usage:

0.7% CPU, Load Average: 0.72, 1.06, 0.86

RAM: 524.6 MB used (39%)
Free: 819.4 MB
Total: 1.3 GB


What can be happening? What are "dcachesize" and "oomguarpages"?





Would it be wise to prepay for 10 years on domain registration?


I have a domain name coming up for expiry soon, and I use it for business. Would it be wise to just pay for 10 years, assuming prices are going to go up? I can't foresee them going down.





What is the difference between a hosted website and a Google Sites website?


I am thinking of a startup in which I'll make a website where I will upload my lecture videos; people can watch them for free, and if they want they can log in (for free), and I will make money from ads on my website. But since I have to pay initially for a domain name etc. when hosting a website (from GoDaddy), I thought: why not go for Google Sites, which is free, and where I can see whether my lectures are getting popular or not (for free)? But my friend said that if I make a website and host it with GoDaddy, I will earn money from ads and it will cover my expenses, and that going with Google Sites is not profitable because sometimes Google AdSense doesn't pay even if you deserve it. So basically both methods are kind of free (I think so... please tell me if I am wrong on this). I am confused about this. Please tell me the pros and cons of both methods. Which method can fill my wallet more?





My domain is now pointing to something called Leader Essential Tools [on hold]


I had my AWS instance on a Dotster-registered domain, and it was working well for a while. I took my instance down to do some reconfiguration, and going to my domain name now brings up something called Leader Essential Tools, which I can't find mentioned anywhere. No clue why this is happening. Any clues?





HTML sitemap for a large site [on hold]


I have a website with a huge number of pages (400K). I have generated an XML sitemap; that part is fine, as there is a lot of information on how to do it. But I am not sure how to implement an HTML sitemap. Do I need to include only the categories (about 300), or would it be better to add the categories and all the products? If the latter, how should I separate the product pages? Create some sort of paging?





New Facebook page not indexed


My Facebook page used to be first on Google. 2 days ago I found this message under my URL in the Google results: "A description for this result is not available because of this site's robots.txt".

Is there any solution to this problem?

Regards, Khaled





I just moved entire website to a new server, but am still getting traffic to my old one (caching issue?)


I just sold a website, so I transferred everything to the new owner's server. The name servers are now pointing to his server as well. I also replaced all of my AdSense tags with the buyer's. However, I'm still getting a small number of daily AdSense views on all of my tags even though none of them are on the site anymore (1,000 page views per day or so).


My question is this: is there a caching issue here or something? Are DNS providers not honoring the new server location, and will it just take time for this issue to go away?


How can I resolve this issue?





Google can't access my Facebook page [on hold]


In the Google search results, under my Facebook page URL, it says the page can't be accessed due to an error in robots.txt. My page URL: http://ift.tt/1vlhhzt. How can I solve this problem? Regards, Khalid





How can I see a graph of Google Play Referrals on Google Analytics


I am able to see the referrals in the referral flow section, but I cannot find them anywhere else.


I need to see stats about each referral over time. Is there any way to do this?







Find when a URL was indexed by Google


How do I find out when a particular URL, not owned by me, was indexed by Google?


If the question is off-topic, please suggest which SE site would suit it better.





a domain's NS record point to itself


As I understand it, if I have the domain A.com, it must be managed by nameservers under another domain, like ns1.B.com and ns2.B.com:



$ dig ns alibaba.com

; <<>> DiG 9.9.2 <<>> ns alibaba.com
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 45070
;; flags: qr rd ra; QUERY: 1, ANSWER: 4, AUTHORITY: 0, ADDITIONAL: 0

;; QUESTION SECTION:
;alibaba.com. IN NS

;; ANSWER SECTION:
alibaba.com. 76962 IN NS nshz.alibabaonline.com.
alibaba.com. 76962 IN NS nsp.alibabaonline.com.
alibaba.com. 76962 IN NS ns8.alibabaonline.com.
alibaba.com. 76962 IN NS nsp2.alibabaonline.com.

;; Query time: 6 msec
;; SERVER: 192.168.1.28#53(192.168.1.28)
;; WHEN: Wed Dec 3 18:56:43 2014
;; MSG SIZE rcvd: 117


but what I don't understand is:



$ dig ns aliedge.com

; <<>> DiG 9.9.2 <<>> ns aliedge.com
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 48684
;; flags: qr rd ra; QUERY: 1, ANSWER: 4, AUTHORITY: 0, ADDITIONAL: 0

;; QUESTION SECTION:
;aliedge.com. IN NS

;; ANSWER SECTION:
aliedge.com. 140046 IN NS ns4.aliedge.com.
aliedge.com. 140046 IN NS ns2.aliedge.com.
aliedge.com. 140046 IN NS ns3.aliedge.com.
aliedge.com. 140046 IN NS ns1.aliedge.com.

;; Query time: 6 msec
;; SERVER: 192.168.1.28#53(192.168.1.28)
;; WHEN: Wed Dec 3 18:58:32 2014
;; MSG SIZE rcvd: 101


aliedge.com is the authoritative nameserver for itself. How can this happen?
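
The usual answer to this chicken-and-egg situation is glue records: the parent zone (.com) serves the A records of ns1..ns4.aliedge.com alongside the delegation, so resolvers never have to resolve the nameserver names through aliedge.com itself. You can observe this by querying one of the real .com registry servers directly:

# The delegation and its glue A records appear in the referral
dig +norecurse ns aliedge.com @a.gtld-servers.net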





I need help designing first website


Let me start by saying, that I did try to post on the correct StackExchange site.


*Background: I am an undergrad college student majoring in an IT-related field. I am looking to build a website, nothing fancy. Topics will likely include both personal and IT/InfoSec-related content, as that is what I want to major in.


*Purpose of the site: a deliverable that may help me in my job search when I am fresh out of college, if it is well designed.


*Features: PayPal donation button (I know I need the API), SSL maybe, Firewall if not provided by hosting provider, and Private WHOIS.


Question(s):


1.) Who to use for hosting? WordPress was recommended by someone. I looked into GoDaddy, and for a bare-bones site, after the first year ($1/month, minimal features) it goes to about $12/month.


2.) Platform to use for development (avoiding Yahoo).


3.) Hosting provider (balancing security and cost). Do I need something like Sucuri?


4.) Is WordPress really decently secure? I keep reading about multiple new frequent vulnerabilities.


5.) Do I need an SSL certificate? If so, where is a legitimate place to get one for cheap?


6.) Funding ideas for the site (I am a college student with not a lot of discretionary income.)


What I have researched: I considered affiliate networks, maybe a PayPal button on the site, or, worst case, ads. Any ideas for initial funding? Based on the limited experience from a college course (learning limited HTML) that I took 4 years ago, I am pretty sure I should use Microsoft FrontPage or some type of website designer.


Any/all suggestions would be greatly appreciated!





How to install HTML Plug-ins for Weebly Website Builder


I've been using Weebly for a year now and I want to use its embed code feature to maximize customization. I want to use Bootstrap or another HTML "plug-in", but I can't get that to work.


Is there any way to install an HTML plug-in for Weebly? (I'm not asking for a recommendation of anything, I just want to know how to apply a plug-in to a Weebly site.)










Tuesday, December 2, 2014






MediaWiki user preferences - hide some fields? [on hold]


We've integrated MediaWiki with our website and wish to block/hide some fields on the Special:Preferences page so they cannot be changed by users.


Specifically: we don't want users to be able to change their email and language.


Thanks
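
For reference, vanilla MediaWiki (1.16+) has a configuration variable for exactly this; a sketch for LocalSettings.php, with the caveat that 'language' is a documented preference key, while hiding the email field may need its own key or additional measures (verify against your Special:Preferences form):

// LocalSettings.php: hide the language selector from Special:Preferences
$wgHiddenPrefs[] = 'language';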





Adding metadata to URLs to filter/scope in Google Analytics


I have a site where I am trying to gather analytics on the category of a URL. For example, we have stories for both dogs and cats, but the URLs for both kinds of stories live under "/animals/".


I'm trying to figure out how to pull the number of pageviews for each category. I was thinking I would do this by pulling out the filtered URLs of each category and applying metadata via a Data Import to Google Analytics. I was able to import the data, but I am unable to filter the data based on the import, and I have no way of knowing whether the import matched any items, or whether it succeeded or failed.


I'm curious if anyone else has attempted something similar, and what your process has been.





Why does SSH ask for password as one user and not another?


My SSH config contains two entries to SSH into the same host with different usernames



Host babyroot
Hostname XXX.XXX.XXX.XXX
User root
ServerAliveInterval 300
IdentityFile ~/.ssh/id_rsa
ForwardAgent yes

Host babyubuntu
Hostname XXX.XXX.XXX.XXX
User ubuntu
ServerAliveInterval 300
IdentityFile ~/.ssh/id_rsa
ForwardAgent yes


ssh babyroot lets me in fine.


ssh babyubuntu asks for a password.


On the remote machine, root's .ssh directory looks like this:


http://ift.tt/12pjtjo


and ubuntu's .ssh looks like this:


http://ift.tt/1FM4azA


Both authorized_keys files are identical.



ssh babyroot -vvv gives

OpenSSH_6.2p2, OSSLShim 0.9.8r 8 Dec 2011
debug1: Reading configuration data /Users/XXX/.ssh/config
debug1: /Users/XXX/.ssh/config line 1: Applying options for *
debug1: /Users/XXX/.ssh/config line 293: Applying options for babyroot
debug1: Reading configuration data /etc/ssh_config
debug1: /etc/ssh_config line 20: Applying options for *
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug3: mux_client_forwards: request forwardings: 0 local, 0 remote
debug3: mux_client_request_session: entering
debug3: mux_client_request_alive: entering
debug3: mux_client_request_alive: done pid = 45382
debug3: mux_client_request_session: session request sent
debug1: mux_client_request_session: master session id: 2
Last login: Tue Dec 2 18:51:29 2014 from YYY
root@baby:~#


whereas ssh babyubuntu -vvv gives



OpenSSH_6.2p2, OSSLShim 0.9.8r 8 Dec 2011
debug1: Reading configuration data /Users/XXX/.ssh/config
debug1: /Users/XXX/.ssh/config line 1: Applying options for *
debug1: /Users/XXX/.ssh/config line 300: Applying options for babyubuntu
debug1: Reading configuration data /etc/ssh_config
debug1: /etc/ssh_config line 20: Applying options for *
debug1: auto-mux: Trying existing master
debug1: Control socket "/Users/XXX/.ssh/sockets/ubuntu@AAA" does not exist
debug2: ssh_connect: needpriv 0
debug1: Connecting to XXX [XXX] port 22.
debug1: Connection established.
debug3: Incorrect RSA1 identifier
debug3: Could not load "/Users/XXX/.ssh/id_rsa" as a RSA1 public key
debug1: identity file /Users/XXX/.ssh/id_rsa type 1
debug1: identity file /Users/XXX/.ssh/id_rsa-cert type -1
debug1: Enabling compatibility mode for protocol 2.0
debug1: Local version string SSH-2.0-OpenSSH_6.2
debug1: Remote protocol version 2.0, remote software version OpenSSH_6.6.1p1 Ubuntu-2ubuntu2
debug1: match: OpenSSH_6.6.1p1 Ubuntu-2ubuntu2 pat OpenSSH*
debug2: fd 3 setting O_NONBLOCK
debug3: load_hostkeys: loading entries for host "XXX" from file "/Users/XXX/.ssh/known_hosts"
debug3: load_hostkeys: found key type RSA in file /Users/XXX/.ssh/known_hosts:2
debug3: load_hostkeys: loaded 1 keys
debug3: order_hostkeyalgs: prefer hostkeyalgs: ssh-rsa-cert-v01@openssh.com,ssh-rsa-cert-v00@openssh.com,ssh-rsa
debug1: SSH2_MSG_KEXINIT sent
debug1: SSH2_MSG_KEXINIT received
debug2: kex_parse_kexinit: diffie-hellman-group-exchange-sha256,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1
debug2: kex_parse_kexinit: ssh-rsa-cert-v01@openssh.com,ssh-rsa-cert-v00@openssh.com,ssh-rsa,ssh-dss-cert-v01@openssh.com,ssh-dss-cert-v00@openssh.com,ssh-dss
debug2: kex_parse_kexinit: aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-gcm@openssh.com,aes256-gcm@openssh.com,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,rijndael-cbc@lysator.liu.se
debug2: kex_parse_kexinit: aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-gcm@openssh.com,aes256-gcm@openssh.com,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,rijndael-cbc@lysator.liu.se
debug2: kex_parse_kexinit: hmac-md5-etm@openssh.com,hmac-sha1-etm@openssh.com,umac-64-etm@openssh.com,umac-128-etm@openssh.com,hmac-sha2-256-etm@openssh.com,hmac-sha2-512-etm@openssh.com,hmac-ripemd160-etm@openssh.com,hmac-sha1-96-etm@openssh.com,hmac-md5-96-etm@openssh.com,hmac-md5,hmac-sha1,umac-64@openssh.com,umac-128@openssh.com,hmac-sha2-256,hmac-sha2-512,hmac-ripemd160,hmac-ripemd160@openssh.com,hmac-sha1-96,hmac-md5-96
debug2: kex_parse_kexinit: hmac-md5-etm@openssh.com,hmac-sha1-etm@openssh.com,umac-64-etm@openssh.com,umac-128-etm@openssh.com,hmac-sha2-256-etm@openssh.com,hmac-sha2-512-etm@openssh.com,hmac-ripemd160-etm@openssh.com,hmac-sha1-96-etm@openssh.com,hmac-md5-96-etm@openssh.com,hmac-md5,hmac-sha1,umac-64@openssh.com,umac-128@openssh.com,hmac-sha2-256,hmac-sha2-512,hmac-ripemd160,hmac-ripemd160@openssh.com,hmac-sha1-96,hmac-md5-96
debug2: kex_parse_kexinit: none,zlib@openssh.com,zlib
debug2: kex_parse_kexinit: none,zlib@openssh.com,zlib
debug2: kex_parse_kexinit:
debug2: kex_parse_kexinit:
debug2: kex_parse_kexinit: first_kex_follows 0
debug2: kex_parse_kexinit: reserved 0
debug2: kex_parse_kexinit: curve25519-sha256@libssh.org,ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1
debug2: kex_parse_kexinit: ssh-rsa,ssh-dss,ecdsa-sha2-nistp256
debug2: kex_parse_kexinit: aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-gcm@openssh.com,aes256-gcm@openssh.com,chacha20-poly1305@openssh.com,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,rijndael-cbc@lysator.liu.se
debug2: kex_parse_kexinit: aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-gcm@openssh.com,aes256-gcm@openssh.com,chacha20-poly1305@openssh.com,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,rijndael-cbc@lysator.liu.se
debug2: kex_parse_kexinit: hmac-md5-etm@openssh.com,hmac-sha1-etm@openssh.com,umac-64-etm@openssh.com,umac-128-etm@openssh.com,hmac-sha2-256-etm@openssh.com,hmac-sha2-512-etm@openssh.com,hmac-ripemd160-etm@openssh.com,hmac-sha1-96-etm@openssh.com,hmac-md5-96-etm@openssh.com,hmac-md5,hmac-sha1,umac-64@openssh.com,umac-128@openssh.com,hmac-sha2-256,hmac-sha2-512,hmac-ripemd160,hmac-ripemd160@openssh.com,hmac-sha1-96,hmac-md5-96
debug2: kex_parse_kexinit: hmac-md5-etm@openssh.com,hmac-sha1-etm@openssh.com,umac-64-etm@openssh.com,umac-128-etm@openssh.com,hmac-sha2-256-etm@openssh.com,hmac-sha2-512-etm@openssh.com,hmac-ripemd160-etm@openssh.com,hmac-sha1-96-etm@openssh.com,hmac-md5-96-etm@openssh.com,hmac-md5,hmac-sha1,umac-64@openssh.com,umac-128@openssh.com,hmac-sha2-256,hmac-sha2-512,hmac-ripemd160,hmac-ripemd160@openssh.com,hmac-sha1-96,hmac-md5-96
debug2: kex_parse_kexinit: none,zlib@openssh.com
debug2: kex_parse_kexinit: none,zlib@openssh.com
debug2: kex_parse_kexinit:
debug2: kex_parse_kexinit:
debug2: kex_parse_kexinit: first_kex_follows 0
debug2: kex_parse_kexinit: reserved 0
debug2: mac_setup: found hmac-md5-etm@openssh.com
debug1: kex: server->client aes128-ctr hmac-md5-etm@openssh.com none
debug2: mac_setup: found hmac-md5-etm@openssh.com
debug1: kex: client->server aes128-ctr hmac-md5-etm@openssh.com none
debug1: SSH2_MSG_KEX_DH_GEX_REQUEST(1024<1024<8192) sent
debug1: expecting SSH2_MSG_KEX_DH_GEX_GROUP
debug2: dh_gen_key: priv key bits set: 109/256
debug2: bits set: 520/1024
debug1: SSH2_MSG_KEX_DH_GEX_INIT sent
debug1: expecting SSH2_MSG_KEX_DH_GEX_REPLY
debug1: Server host key: RSA 84:cd:56:8f:f7:as:45:58:ee:ds:91:9a:f6:fd:6d:6a
debug3: load_hostkeys: loading entries for host "XXX" from file "/Users/XXX/.ssh/known_hosts"
debug3: load_hostkeys: found key type RSA in file /Users/XXX/.ssh/known_hosts:2
debug3: load_hostkeys: loaded 1 keys
debug1: Host '162.243.238.160' is known and matches the RSA host key.
debug1: Found key in /Users/XXX/.ssh/known_hosts:2
debug2: bits set: 515/1024
debug1: ssh_rsa_verify: signature correct
debug2: kex_derive_keys
debug2: set_newkeys: mode 1
debug1: SSH2_MSG_NEWKEYS sent
debug1: expecting SSH2_MSG_NEWKEYS
debug2: set_newkeys: mode 0
debug1: SSH2_MSG_NEWKEYS received
debug1: Roaming not allowed by server
debug1: SSH2_MSG_SERVICE_REQUEST sent
debug2: service_accept: ssh-userauth
debug1: SSH2_MSG_SERVICE_ACCEPT received
debug2: key: /Users/XXX/.ssh/id_rsa (0x7fb9cb501f70), explicit
debug1: Authentications that can continue: publickey,password
debug3: start over, passed a different list publickey,password
debug3: preferred publickey,keyboard-interactive,password
debug3: authmethod_lookup publickey
debug3: remaining preferred: keyboard-interactive,password
debug3: authmethod_is_enabled publickey
debug1: Next authentication method: publickey
debug1: Offering RSA public key: /Users/XXX/.ssh/id_rsa
debug3: send_pubkey_test
debug2: we sent a publickey packet, wait for reply
debug1: Authentications that can continue: publickey,password
debug2: we did not send a packet, disable method
debug3: authmethod_lookup password
debug3: remaining preferred: ,password
debug3: authmethod_is_enabled password
debug1: Next authentication method: password
ubuntu@XXX's password:
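
One observation on the second trace: the key is offered ("Offering RSA public key") and the server silently falls back to password auth, which on Ubuntu most often points at server-side file permissions rather than a key mismatch; a sketch of the usual checks (paths assumed):

# On the remote machine:
chmod 750 /home/ubuntu                      # home dir must not be group/world-writable
chmod 700 /home/ubuntu/.ssh
chmod 600 /home/ubuntu/.ssh/authorized_keys
chown -R ubuntu:ubuntu /home/ubuntu/.ssh
# Then retry while watching the server side: tail -f /var/log/auth.log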




Why is a local IP address the largest referrer in Google Analytics?


After 2 weeks of running Google Analytics on a site, I see most of the traffic is coming from this IP address: 192.168.110.1.


I've added a filter to exclude it now, but why is it there, and could it be a valid referrer?







Deny server from adding code to my page?


My host adds JavaScript in particular cases. I want to prevent that, maybe through .htaccess?


The added code is the following


So even preventing its PHP from executing would be useful.


Thanks in advance!





Does IIS 7.5 use a separate "AppDomain" for each subdomain?


Within an IIS worker process, separate sites (i.e. domains) run in their own app domain.


I have a site example.com which has a wildcard DNS record (i.e. *.example.com pointed to an IP). Will processing for a subdomain resource z.example.com be isolated into a separate appdomain from the handling of requests for www.example.com? I think the answer must be yes, but I can't find anything definitive.





Cause a PDF from a website to be viewed automatically, even on phones


I have a PDF on my website. It works fine on a computer, but on a phone there is an extra step for the user: after going to the page with the PDF, they then have to click a link to view it. Is there any way I can bypass this link? I am assuming not, for security reasons, but figured I'd take a shot.



<object data="Resources\myPDF.pdf" type="application/pdf" width="100%" height="100%">
<p><a href="Resources\myPDF.pdf">It looks like you are on a phone. No worries though, just click on this text to
view my PDF in your mobile browser.</a></p>
</object>


On a computer, myPDF.pdf opens and is shown in the browser when you navigate to the page containing the above code. However, on a phone, the link needs to be clicked in order to view myPDF.pdf. The only thing on the page is myPDF.pdf, so it is OK if nothing else is on the page. Any ideas?


I am looking for a generic solution for all phones, but the only phone I have actually been able to test it on is an iPhone using Safari. Other browsers may give a different result or require a different solution than the scenario I described above.





Auto-updating RSS feeds from page content divs


My site has about 40 separate RSS feeds cordoned off in their own directories. For years I maintained these quite easily using the Dreamfeeder Dreamweaver extension. When I created and published a new alert (an HTML page in the same dir as the .rss file) and "processed" the feed (this involved simply clicking a gear icon), Dreamfeeder would automatically update the .rss file with a new item (extracting the headline and body content from the new HTML file in its respective directory based on div ids). I could then publish the updated .rss file, where Twitter would find the new content within 30 minutes and tweet it (using TwitterFeed).


This was an extremely effective process (though the one-time initial setup for each feed was a bit involved). The problem now is that DreamFeeder has not been developed since 2009 and is not compatible beyond Dreamweaver 5. Having recently upgraded to Dreamweaver CC, for the life of me I can't find anything even approaching a suitable substitute. I keep landing back at FeedForAll, but as near as I can tell FFA can't scan directories for new files or extract content from HTML based on div ids; instead, feed items have to be copied and pasted into FFA by hand. Unless I'm missing something...?


So in effect, upgrading Dreamweaver has clunked up my process. I miss the elegance of the old way.


Any suggestions on tools that can accomplish this task? No amount of Googling has turned up anything useful yet, and I'm just Googling in circles now. I don't care whether the solution is integrated into Dreamweaver (in fact, it would be more flexible if it wasn't), I just don't want to have to go back to manually updating dozens of RSS feeds. Any input is appreciated.





Can I use sitemap index for media RSS (mRrss) files?


I have a lot of mRSS (Media RSS) files. I plan to create a sitemap index to contain them.

Are sitemap indexes allowed to reference mRSS files? To my knowledge, they are only allowed to contain image, video, and content sitemaps. Is that right?





No one can access our website today using http, but can with https


We arrived at work this morning to find our website was 'down'. It loads a blank white page with no messages at all. It worked fine last night and we have made no changes. (I am the only one who has access to make changes.)


It is very odd in that I can access it if I use the https prefix but, as soon as I try to use the normal http prefix (or click on any link on the https page that loaded), I get the blank white page. I am using Firefox but have tried in IE as well. I have tried from two different local computers and people have tried from elsewhere in the country with the same results. I have also tried with Safari on Verizon's network. So I believe I have eliminated our local machines and our ISP from the equation. If it matters, I have also tried pinging our website from a command line and it responds quickly with no packets lost.


Our hosting company (Powweb) initially told us they can view the site just fine from their end but, upon hearing all the information from above, they escalated it to their top support priority level. However, they have been looking at it for several hours.


This is catastrophic because we're an online retail sales company and it's the Christmas season! Someone please help!!





What web hosting service should I go for?


I want to launch a website, and since I am really new at this I am not able to decide which web hosting service will be best for me. GoDaddy is the name I am hearing everywhere, but a few people say that it's not good and that they make fools of us by applying hidden charges, stealing domain names, etc. So I just want to know what alternatives exist that are better than GoDaddy. And is GoDaddy really good? Would you suggest I go with GoDaddy?





How is a slash interpreted by search engines? SEO Best Practice


I would like to know your opinions on the following URL structure. Is the following SEO best practice and why?


If targeting search term "blue suede shoes"...


Would "example.com/blue/suede-shoes/" be read appropriately by search engines for SEO benefit?


I have a rather large site with many product/service offerings and require the collective's expertise.


My concern is that targeting many services with structure "example.com/blue-suede-shoes" would not only become a web management issue but not be best practice for user experience. However, strictly SEO speaking, which is best?





Google uses the old blog title in search results


As a matter of optimization, a few weeks ago I modified the way the titles of my blog posts are displayed (I also made a change to the title of the blog). Then I used Google Webmaster Tools to re-index the old pages and update the titles. At first the titles updated and the posts appeared correctly on Google.

The problem is that the titles of posts have now begun to appear incorrectly in the search results, using the old blog title. Some posts continue to display the updated title, but many recent posts show the old one.


I've tried re-indexing the pages with Google Webmaster Tools, but the titles remain unchanged in the search results.


Is there any way to "force" Google to display the correct titles?





multi-page views and avoiding duplicate content


I have a website hosting thousands of photos that were taken in real life. My problem comes down to advanced pagination.


Currently, I have the site divided into photo albums, then galleries in each album, which is OK. The problem begins, from an SEO perspective, when the gallery starts and the individual pictures are accessed. The reason is that in each gallery, people can choose to view either a small number or a large number of photos at a time. These numbers are always chosen from a fixed list (from this point on, I will refer to these numbers as NPP). To make navigation simple and concrete for both search engines and people, and to minimize code size, each image in the gallery requires the NPP in the URL. The problem is that each page could be duplicated as many times as there are different NPP values to choose from. In my setup now, people can choose any of 4 NPP values, and thus each image page could be duplicated 3x according to search engines.


If I confused you, I'll explain the rough folder structure used on my site to illustrate the problem. Each folder below is relative to document root for the domain.



/ = Photo albums home page
/venue-name/ = Photo gallery listing for venue.
/venue-name/MM-DD-YYYY/ = Redirect to 500 photos per page for same venue and dated MM-DD-YYYY.
/venue-name/MM-DD-YYYY/500pp = 500 photos per page for same venue and same date
/venue-name/MM-DD-YYYY/100pp = 100 photos per page for same venue and same date
/venue-name/MM-DD-YYYY/200pp = 200 photos per page for same venue and same date
/venue-name/MM-DD-YYYY/50pp = 50 photos per page for same venue and same date
/venue-name/MM-DD-YYYY/50pp/image/1 = 1st image for same venue and same date
/venue-name/MM-DD-YYYY/100pp/image/1 = 1st image for same venue and same date
/venue-name/MM-DD-YYYY/200pp/image/1 = 1st image for same venue and same date
/venue-name/MM-DD-YYYY/500pp/image/1 = 1st image for same venue and same date


As you can see, everything is fine until the individual images part (the last 4 lines). The page at each of the last 4 URLs is exactly the same. I have thought about and tried cookies to store the NPP, but that wouldn't help, because a user might have different preferences from a search engine. I'm also running advertisements and do not want a robot thinking a page is messed up because it displays differently to a user than to a robot, all because of a cookie.


Now, if this were just my site and users didn't care, I would only use an NPP of 100, follow more of the 100-links-per-page rule, and eliminate the rest of this. Unfortunately, the majority of users want an NPP of 500; I added the other options so people don't have to wait long to see pictures.


Also, because I'm referencing galleries from elsewhere on the site, I feel I have to make a concrete decision on the NPP or use a cookie, but because I'm running advertisements I don't want to confuse the robots.


I have also tried this:



/venue-name/MM-DD-YYYY/image/1 = 1st image for same venue and same date


But if I do that, then when users use the go-back feature they can't keep their preferences without a cookie, because nothing in the URL shows the NPP.


Does anyone have an idea how I should fix this potential duplicate content issue without making users lose their settings or confusing robots?
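
For reference, the standard mechanism for exactly this situation is a canonical link: every NPP variant of an image page names one URL as authoritative, so search engines fold the duplicates together while users keep the NPP in their URLs; a sketch assuming the 100pp version is chosen as canonical (the domain is a placeholder):

<!-- In the <head> of the 50pp/200pp/500pp variants of image 1 -->
<link rel="canonical" href="http://example.com/venue-name/MM-DD-YYYY/100pp/image/1">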





Should "next posts pages" be included in the sitemap?


I wonder if including the "next posts pages" in the sitemap would be a good idea. By "next posts pages" I mean, for example, domain/page/{num}, or domain/category/foo/page/{num}, or even domain/tag/foo/page/{num}.


I'm using the WordPress SEO by Yoast plugin, and it doesn't include those pages.


Of course, the content of those pages is also included in the single posts, but maybe it would be better to include all pages so that Google can index more pages.





Apache error MAMP virtual host


I am trying to run a project, built on Ubuntu with PHP on the Fat-Free Framework, using MAMP on macOS Yosemite 10.10.1.

I imported a database successfully through phpMyAdmin, and following these instructions http://ift.tt/1yeqtM1 my httpd-vhosts.conf file looks like this:


ServerAdmin webmaster@localhost



DocumentRoot "/var/www/site"

<Directory "/var/www/site">
Options -Indexes FollowSymLinks Includes
AllowOverride All
Order allow,deny
Allow from All
</Directory>


ErrorLog ${APACHE_LOG_DIR}/error.log

# Possible values include: debug, info, notice, warn, error, crit,
# alert, emerg.
LogLevel debug

CustomLog ${APACHE_LOG_DIR}/access.log combined


The project files are located in the "htdocs" web root folder at /Applications/MAMP/htdocs. Note that when I try to run it in this form, the Apache server throws an error and does not run. But when I comment out the ErrorLog section, the Apache server runs properly, yet all I get is a blank page!


Has anyone come across this problem before? It is taking me ages to get this running! Any help would be valuable! Thank you.





Duplicated content - how to tell Google I am original author


Let's say I prepared an article I wish to publish on my not-so-popular blog. But I am afraid that this article is so good that a much more popular site will steal it and publish it as its own. I am guessing that, because of the low popularity of my blog, Google would show links to the bad site instead of my own. Is there a way to prevent such an outcome? And what if I published an article on my blog and later decided it is so good that I'll put it on Wikipedia: is there a way for Google to consider me above Wikipedia?





Having Traffic on website


I have a very simple query that I need solving.



What are the advantages of having traffic on your website?


Will it raise your SEO ranking on Google? Please exclude the fact that it will raise awareness of your product/service; assume that the visitors are robots that only view the pages.


Some information regarding this would be appreciated.


Regards





Indexing Problem


In Google Webmaster Tools it appears that I have 309 pages indexed, but when I search using "site:site.com" only about 180 appear (on the first page it says "about 608 results", something near the total number of posts).


I have also noticed that some pages that were indexed before have simply disappeared from Google. In Google Webmaster Tools there is no error message or anything to indicate any penalty from Google.


According to Google Webmaster Tools, new pages are indexed daily, but for some reason they are not displayed in the search.


Am I really being affected by some Google penalty, or is this some kind of bug?


How can I solve this problem and have all pages indexed normally?


*Sorry for my bad English





Can I use Google Analytics to track a specific referrer URL?


I have a Featured Product block on my website's home page. I'd like to track its efficacy by adding a handle, ?referrer=featured, to it and then have Google Analytics track checkouts on said product.


I'm assuming I'd need to set up a goal to do so. I looked at the options via Admin > Goals > New Goal but can't seem to find anything relevant there.


Will I need to put a specific tracking code in the template page for the product? The tricky bit is that the featured product is changed daily, so I'd like to track if anybody buys any featured product by clicking on the block, hence why I was thinking of the URL handle.


Any ideas? Much appreciated.





Show warning content only once in 30 minutes


I've been trying to write the code, but I fail every time somewhere.

I need to show the div #warning only once every 30 minutes.

I use jquery-1.10.2.min.js, and I'd prefer not to add more JS than necessary.


Thank you
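
A minimal sketch of one way to do it with the jQuery already on the page, assuming a browser with localStorage (the storage key name is arbitrary):

<script>
// Show #warning at most once per 30 minutes, remembering the last
// time it was shown in localStorage.
$(function () {
  var THIRTY_MIN = 30 * 60 * 1000;
  var last = parseInt(localStorage.getItem('warningShownAt') || '0', 10);
  if (Date.now() - last > THIRTY_MIN) {
    $('#warning').show();
    localStorage.setItem('warningShownAt', String(Date.now()));
  }
});
</script>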





Can I use multiple Paypal Products together?


I am building a complex website that has multiple scenarios. I need to know if I can use multiple PayPal products on one website, as needed.


Here is what I have in my mind:


1) Express Checkout at "Signup" (joining).

2) Express Checkout (recurring payments) for "Periodic Dues" that start after the first period (as the first period is already covered by the joining fee).

3) PayPal Mass Pay for revenue distribution to the users who paid to join in step 1.

4) Adaptive Payments (parallel) for "buy" and "sell" of user products between users (they will add products on their own and sell on their own; I will just get a percentage).


What do you say? Is it doable?





Monday, December 1, 2014






How to find unknown keywords that I rank for


It's easy to find where you rank for keywords that you want to measure.


But how do you find out what other keywords you rank for that you don't know about?





Good responsive skin for MediaWiki 1.24? [on hold]


All I was looking for is a nice responsive design that would make MediaWiki look good and functional "out of the box". While I imagined installing nice stable skins on MediaWiki would be straightforward, I have unfortunately found that this isn't the case. Many of the skins presented at http://ift.tt/1vDtZP9, and even the "stable" ones at http://ift.tt/1vDu05n, seem to be somewhat broken. Am I missing some nice resource where skins can be added without the hassle of fixing a partially finished product?


Here are some of the skins I tried and problems arising from their use:



  • vector - the default looks horrible on a narrow mobile phone display. The other skins bundled by default (MonoBook, Modern, CologneBlue) suffer from the same problems plus an uglier design.

  • bootstrapskin - the search box's "go" button sits under the search box.

  • bootstrapskinmini - most of the top links point back to the skin's own website. Really?!

  • Refreshed - no images; looks broken.





Custom events that happen during the same pageview?


Is it possible to get a report that shows custom event #1 occurred during the same pageview as custom event #2?


Use case:




  • Person views all slides in a carousel




  • Person adds product to cart




We want to know that this happened while they were on the same page. We don't care if they viewed all the slides of a carousel on one pageview and ended up adding to cart on a different pageview (it could be a different product).


The only way I can think of is we need to handle this in the programming on our side, and fire a different custom event. It seems like there should be a way to have GA track it on their end.





Mediawiki: How to escape '=' in Template parameter values


I want to pass a URL to my MediaWiki template as a parameter, but if it contains =, MediaWiki treats the URL as a named parameter. Therefore, the following does not find and replace the template parameter {{{1}}}:



{{MyUrl|http://ift.tt/1wdJr5P}}


Escaping = as %3D leads to problems when following the link in the browser (Chrome): the server responds with 404.


Also, installing extensions on this MediaWiki server is not recommended. Can this be done in vanilla MediaWiki?
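
For reference, the standard vanilla-MediaWiki workaround is to name the positional parameter explicitly, since everything before the first = is otherwise read as a parameter name (the URL below is a placeholder):

{{MyUrl|1=http://example.com/page?id=42&view=full}}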





MediaWiki links to new articles don't work even after the new article is created until the page with the link is edited


I'm experiencing this issue with MediaWiki 1.23.6.


First I update the Main Page with a link to a new article:



[[New Article]]


Then I click on that link (which is red because the article doesn't exist yet) and create the article.


Then I go back to the Main Page.


The link to [[New Article]] still appears red instead of blue. MediaWiki doesn't detect that the article exists until I edit Main Page again.


It's not a browser cache issue. And I don't have any caching enabled in LocalSettings.php either. What's going on?



$wgMainCacheType = CACHE_NONE;
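
One standard diagnostic in vanilla MediaWiki (no extension needed) is to force-purge the linking page, which rebuilds its cached rendering and link colors without an edit; if the link turns blue after this, some layer of parser caching is still active despite the setting above:

http://yourwiki/index.php?title=Main_Page&action=purge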




Why is my bandwidth all of a sudden being sucked by bots? [on hold]


In all the years we've had our website, we have never had a bandwidth problem with bots. I'm a new webmaster, and I implemented a new WordPress install with a new theme; the problem started after I was hired. Can you give me some ideas as to why a new WordPress theme might be the cause of all these bots sucking up the bandwidth all of a sudden? Thanks!





How can I lose google pagerank?


I know this question may sound weird, but I was wondering what the best way is to lose PageRank without getting removed from Google.





Proxy error on Web Freer and Chrome


How do I use Chrome without a proxy? Could Freegate be my problem? I unchecked the proxy option in settings, but it didn't work.





Up-to-date list of user agent distribution [on hold]


Does anybody know where I can find a list of the user agents used by visitors, with their relative frequencies (including any bots, etc.)?


I.e. a list like the following:



  • 2% requests with ua string1

  • 1% requests with ua string2

  • etc


Clearly this depends on the site, so a not-too-obscure site as basis for the collected data would be ideal.





Website testing for some percent of visitors


What is the name of the website-testing method where a new feature is shown only to some users (not all), for instance 20%? Are there online services for that? I know about A/B testing, but I'm not sure that's what I need.










What changes are coming that would reduce the problems assosciated with HTTPS compared to HTTP?


I have a website under development. It is used for surveys, but there's an advertising component. As it is very new, I have used almost all the latest technologies: AngularJS, ASP.NET Web API 2, etc.

I am considering what I should do about HTTP and HTTPS. I want my site to appear secure, and for me it's important that users see a lock, or even better the company name, in the browser's address bar to emphasize security.

However, I understand that HTTPS has some drawbacks, such as the time taken to encrypt. Going forward, is it likely that these drawbacks will become less important, so that, for example, in 5 years' time HTTPS becomes the standard rather than the exception?





Impact on SEO of changing from a naked domain to www


I set up my website from the beginning to be used without the www, i.e. 'naked', with the following rules in .htaccess, and everything is working fine:

RewriteCond %{HTTP_HOST} ^www.soeezauto.ma [NC]
RewriteRule ^(.*) http://soeezauto.ma/$1 [R=301,L]


Now I have placed the website on CloudFlare for optimization reasons, and it seems they can only handle www domains (if you come through a partner host, which is my case), so I need to change my domain to work as www.mydomain.com.


My main concerns are:




  • losing INDEXATION




  • losing HISTORICAL DATA on Analytics/Webmaster




  • make sure all the web pages I have out there without www will show normally when I change to www.mydomain.com




This is what I found so far:




  • In Google Webmaster Tools, under site settings, the preferred domain is marked as not defined, meaning neither mydomain.com nor www.mydomain.com is selected. Should I leave it as is, or should I choose www.mydomain.com? Or is there something else I should do?




  • in Google Analytics I found this article. Is it fine to do like that?




  • in .htaccess:

RewriteCond %{HTTP_HOST} ^.mydomain.com [NC]
RewriteRule ^(.*) http://ift.tt/Rvl2Tu [R=301,L]




Also, I read in this question that I should redirect ALL of my web pages to www via .htaccess. Is that the case? I have about 4000 pages out there, and I believe such a huge .htaccess would slow the server down quite a lot.
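
For what it's worth, redirecting every page does not require one rule per page: a single pattern rule covers all of them, so the .htaccess stays tiny regardless of page count. A sketch (the domain is a placeholder):

RewriteEngine On
# Redirect every non-www request to the www host, keeping the path
RewriteCond %{HTTP_HOST} ^mydomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.mydomain.com/$1 [R=301,L]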


Will that work? The last thing I want is to mess this up and lose data/indexation, not to mention have the site inaccessible to my visitors.


I appreciate any guidance.


Thanks