08 November 2005

Blog Aggregator

Aggregators pull information out of blogs and the wider net, and republish it.

But what about the opposite: what if there were a way to use a blog to extract information from an aggregator?

I'm talking about taking the settings in the aggregator and including them as specific functions within the blog.

Then I can say, "I want to post all the content in my blog related to my aggregator selections."

Sometimes people say, "We can already do that." Small problem: There are some restrictions on spam.

So I got to thinking: what if, rather than fighting spam, we used spam-like features to our advantage?

That way, like the record industry's relationship with music downloads, we can "use what people are doing" rather than trying to make them do something else.

I'd like to be able to use the blog as a platform to read the aggregator selections, and then publish those selections so that spammers [who have content that I want] will know to drop their content in my comments. In short, I'm already telling my aggregator what I want; why not tell the world and the spammers [who may also have stuff I want] the specific codes on how to spam my blog [and give me what I want]?
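Those "codes" could live as a machine-readable marker in the blog template that spammers' tools parse. As a sketch, assuming an invented `accept-spam` HTML-comment marker (not any real standard):

```python
import re

# Hypothetical marker a blog could embed to advertise what kind of
# unsolicited content it welcomes, e.g.:
#   <!-- accept-spam: topics=rss,aggregators; max-size=2048 -->
MARKER = re.compile(r"<!--\s*accept-spam:\s*(.*?)\s*-->")

def parse_spam_welcome(html):
    """Return the blog's declared spam preferences, or None if absent."""
    m = MARKER.search(html)
    if not m:
        return None
    prefs = {}
    for field in m.group(1).split(";"):
        key, _, value = field.strip().partition("=")
        # Comma-separated values become lists; single values stay strings.
        prefs[key] = value.split(",") if "," in value else value
    return prefs

page = '<html><!-- accept-spam: topics=rss,aggregators; max-size=2048 --></html>'
print(parse_spam_welcome(page))
```

A sender's tool would fetch the page, parse the marker, and only deliver content matching the declared topics; a blog with no marker gets nothing.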

I'd also like to be able to look at a map and notice which geographic regions are not getting my content. Why not have a blog that allows the spam to be targeted based on content, topic, novelty, and size?

Then I can look for geographic clusters in the downloads and spam, and identify which areas I want to focus on, using other methods to get content there.

This would be like giving the audience the choice on which content they want to get spammed with [if that's what they want].

Also, I'd like to be able to do RSS searches that target particular PDF formatting, not just a generic .pdf extension. I'd like a tool that can read the layout codes and then organize the results into specific folders based on style, color, layout, length, novelty, tone, and other parameters.

I'd also like to be able to take blog content and auto-format it to a PDF layout. I'd like to include simple commands in the blog and have, as a draft in my blog, a visible PDF-like rendition of the content as it would appear in a PDF file, including color and formatting. I'd like to publish the content as text, but have my readers' aggregators display the output as either text or a PDF, depending on what the reader wanted to see.

I'd also like to see an RSS download monitor. I want to know which geographic regions are downloading the content, and have this in a summary chart inside the aggregator: an RSS data flow between the blog, site stats, and the aggregator. I want to know which of my content is getting downloaded, so I can tailor similar content to that specific reader, and then adjust my content codes so that I can identify which of my readers has a "give me spam like this" command in their blog comments. Then I can click a few tick boxes and have my content automatically tailored to the updates in their "give me spam" commands.
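The regional summary chart could be built by tallying ordinary download records. A toy sketch, assuming the stats feed has already been reduced to (region, item) pairs (that shape is an assumption, not a real log format):

```python
from collections import Counter

# Hypothetical (region, item) pairs, as might come from joining
# site stats with feed-download logs.
downloads = [
    ("Europe", "post-42"),
    ("Asia", "post-42"),
    ("Europe", "post-17"),
    ("Europe", "post-42"),
]

# Tally downloads two ways: by region, and by feed item.
by_region = Counter(region for region, _ in downloads)
by_item = Counter(item for _, item in downloads)

print(by_region.most_common())   # which regions pull the most content
print(by_item.most_common(1))    # which post travels furthest
```

The same counts, inverted, show which regions are *not* downloading anything, which is the gap the map view above would highlight.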

I'd like a capture command that not only identifies the style of words about specific subjects, but captures the original quote or comment and posts it to the blog in a PDF-compatible format. The reverse would also be true: I'd like to be able to do a search with the RSS tool, identify key comments or phrases, and then have my aggregator find which blogs have a "give me spam" command [a permission to make a follow-up to that quote], so that with one simple approval, permission-based spam gets sent to those who want it. Yes, spam can be annoying; but I'd also like to be able to send, with one message, a quote or comment to many blogs that are saying the same thing and have stated in advance that they want spam-like mass blasts from people they want to hear from.

I would like an aggregator that searches for key words within the content and posts the matches to a screener. I'd like to be able to auto-post this selected content to a blog with one click.

I'd like a search command in the aggregator where I can look for a specific phrase, report the output to a central registry, and create a message for that content; the system would verify the permission, post the content to that blog, and then allow me to follow up with that content. I'd like to be able to link to follow-up messages, post one message in my blog, and have an auto-publication feature so that many blogs [that want to get spammed with this desirable content] get the message.
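The core of that flow is the permission check against the registry before anything gets posted. A minimal sketch, where the registry is assumed to be a simple mapping from blog URL to the topics that blog has pre-authorized (the structure is invented for illustration):

```python
def broadcast(message, topic, registry):
    """Deliver one message to every blog that has pre-authorized
    this topic; silently skip everyone else.

    `registry` maps blog URL -> set of accepted topics, standing in
    for the 'central registry' of opt-in permissions.
    """
    delivered = []
    for blog, accepted_topics in registry.items():
        if topic in accepted_topics:       # permission check comes first
            delivered.append(blog)         # then (pretend to) post there
    return delivered

registry = {
    "http://alice.example/blog": {"rss", "aggregators"},
    "http://bob.example/blog": {"cooking"},
}
print(broadcast("New aggregator idea", "aggregators", registry))
```

One message in, a fan-out only to consenting blogs out; anyone not in the registry, or registered for other topics, never sees it.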

I'd like to have a prospective search platform to schedule time for tasks and searches. It would show my search plan, show the results of the searches in a horizontal, program-like schedule, and track the percent complete on my research topics. Then I can identify which searches need additional work or variations. I don't want the aggregator to be simply lists of content, but a platform to monitor whether I'm achieving my research objectives. Ideally, it would be a schedule planner combined with an aggregator: the words in a search would show how the request fits into the overall research, while the RSS results would be compared against the overall research plan.
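The percent-complete tracking is the concrete piece of that planner. A sketch, assuming the plan is stored as topics mapping to (query, done) pairs (that shape is made up to illustrate the idea):

```python
def percent_complete(plan):
    """Report completion per research topic.

    `plan` maps topic -> list of (search_query, done) pairs; each
    finished search moves its topic closer to 100%.
    """
    report = {}
    for topic, searches in plan.items():
        done = sum(1 for _, finished in searches if finished)
        report[topic] = round(100 * done / len(searches))
    return report

plan = {
    "RSS tools": [("rss geo stats", True), ("rss pdf search", False)],
    "Spam codes": [("permission spam", True)],
}
print(percent_complete(plan))
```

Topics stuck below 100% are exactly the searches that "need additional work or variations."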

I'd like to have a feature in RSS subscription URLs that allows me to maintain an existing URL-XML subscription, but modify it so that I can block or ignore certain types of URLs or content publishers. I don't want to have to remember the details of the original publication, nor do I want to go back and individually update each XML-URL subscription. Rather, I'd like to have within the aggregator a simple tick box by each result that gives me the option to "ignore this source" and have the aggregator automatically exclude future results from that source. Whether the actual XML-URL is modified, or whether this is a separate step within the aggregator, is of no consequence; I just want a fast way to exclude content from trashy sources, rather than reconstruct in my mind which steps, commands, and phrases I had in the original URL request and then redo the XML-URL update.
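Treating exclusion as an aggregator-side step (rather than rewriting the subscription URL) really is the simpler design: the subscription stays untouched, and the tick box just adds the source to a block set that filters incoming items. A minimal sketch, with the item fields invented for illustration:

```python
def filter_items(items, ignored_sources):
    """Drop feed items whose source the reader has ticked 'ignore'.

    The subscription URL itself is never modified; exclusion happens
    entirely inside the aggregator.
    """
    return [item for item in items if item["source"] not in ignored_sources]

items = [
    {"source": "goodblog.example", "title": "Useful post"},
    {"source": "trashy.example", "title": "Junk"},
]
print(filter_items(items, {"trashy.example"}))
```

Ticking "ignore this source" on a result just adds its source to the set; every future fetch is filtered through it automatically.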

I'd like an aggregator that formats content like a mini-internet. I'd like to still be able to scroll through my content results as I would in a search engine. I'd like a Google- or Yahoo-style search feature within my aggregator so that I can "surf my aggregator content" and do focused searching and link analysis on the aggregator results. This combines the features of a search engine, with stats and link analysis, with the aggregator, so that I can prioritize my reading and narrow in on recurring themes, as opposed to simply scrolling through reams of aggregator-listed results.
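The search-engine-within-the-aggregator layer could start as simple keyword ranking over the cached items. A toy sketch (scoring by raw term frequency, which is a deliberate simplification of what a real search engine does):

```python
def search_cached(items, query):
    """Rank cached aggregator items by how often the query terms
    appear in their text; items with no match are dropped.
    """
    terms = query.lower().split()
    scored = []
    for item in items:
        text = item["text"].lower()
        score = sum(text.count(t) for t in terms)
        if score:
            scored.append((score, item["title"]))
    scored.sort(reverse=True)          # highest score first
    return [title for _, title in scored]

items = [
    {"title": "A", "text": "RSS aggregators and RSS feeds"},
    {"title": "B", "text": "cooking with garlic"},
    {"title": "C", "text": "RSS basics"},
]
print(search_cached(items, "rss"))
```

Ranking instead of listing is the whole point: the recurring themes float to the top rather than being buried in the scroll.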

